Compare commits


No commits in common. "main" and "snapshot-2026-05-08-pre-elegantfin" have entirely different histories.

49 changed files with 210 additions and 11365 deletions

.gitignore (vendored): 1 line deleted

@ -1 +0,0 @@
__pycache__/

README.md: 131 changes

@ -1,90 +1,83 @@
<p align="center">
<img src="assets/logo.png" alt="ARRFLIX" width="420">
</p>
# ARRFLIX
<h3 align="center">My own premium streaming service. No compromise.</h3>
Self-hosted Jellyfin media server on nullstone, LAN-only.
---
> **Start here** → [`ADMIN-GUIDE.md`](ADMIN-GUIDE.md) — the single page that
> tells you what to do day-to-day. Everything else is a reference doc you only
> read when the admin guide tells you to.
ARRFLIX is my personal streaming service. One library, hand-curated, no
filler — every show and film is the best version I could put together. Where
the source allows, masters are 4K. Where it doesn't, they're AI-upscaled until
they look better than the disc ever did. The reference example: my **Rick and
Morty Season 1** is a 4K HDR upscale that beats the original broadcast. That's
the standard for everything that lands here.
## Endpoint
It's not a clone of a public streamer. It's the version I wished existed: the
quality bar of a boutique release group, the polish of a flagship app, and a
library I actually want to watch.
- `https://arrflix.s8n.ru` — accessible only from LAN (192.168.0.0/24) and Tailscale admin/infra tags via Traefik `no-guest@file` middleware.
- DNS resolved internally by Pi-hole (`/opt/docker/pihole/etc-pihole/custom.list`; pin sketch below).
- TLS via Let's Encrypt DNS-01 (Gandi).
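A minimal sketch of that DNS pin, assuming nullstone answers on `192.168.0.100` (the LAN address used elsewhere in this repo) and a containerised Pi-hole named `pihole`:

```bash
# Hypothetical: append the LAN record to Pi-hole's custom.list,
# then reload the resolver so clients pick it up without a restart.
echo "192.168.0.100 arrflix.s8n.ru" >> /opt/docker/pihole/etc-pihole/custom.list
docker exec pihole pihole restartdns reload
```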
---
## Storage
<p align="center">
<img src="assets/screenshots/02-detail-mandalorian.png" alt="ARRFLIX detail page — The Mandalorian">
<br><sub><em>Detail page — full-bleed backdrop, ARRFLIX wordmark, Netflix-grade dark UI</em></sub>
</p>
| Path | Purpose |
|-----------------------------------|-------------------------------|
| `/home/docker/jellyfin/config/` | Jellyfin config + DB (writable, UID 1000) |
| `/home/docker/jellyfin/cache/` | Transcode + image cache |
| `/home/user/media/movies/` | Movies library (mounted RO) |
| `/home/user/media/tv/` | TV library (mounted RO) |
<p align="center">
<img src="assets/screenshots/03-playback-sassy.png" alt="ARRFLIX playback — Sassy the Sasquatch">
<br><sub><em>Playback — Jellyfin chrome hidden, ARRFLIX-red scrubber + clean OSD</em></sub>
</p>
## Routing
<p align="center">
<img src="assets/screenshots/01-search.png" alt="ARRFLIX search">
<br><sub><em>Search — pinned suggestions, ARRFLIX-red accents, no filler</em></sub>
</p>
Traefik docker-label provider does NOT pick up the labels on this container
(unknown reason — file-provider routing for the same backend works). The
deploy uses **file-provider** routing in
`/opt/docker/traefik/config/jellyfin-test.yml`. If you fix the docker-provider
issue later, flip routing back to labels and remove the file-provider snippet.
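A hedged sketch of what that file-provider snippet looks like; the entrypoint and resolver names and the backend URL are assumptions (Jellyfin's default port is 8096), and the deployed `jellyfin-test.yml` is authoritative:

```bash
# Hypothetical shape of the file-provider route, written via heredoc in the
# same style as this repo's patch scripts. Values marked "assumed" are guesses.
cat > /opt/docker/traefik/config/jellyfin-test.yml <<'EOF'
http:
  routers:
    jellyfin:
      rule: Host(`arrflix.s8n.ru`)
      entryPoints: [websecure]      # assumed TLS entrypoint name
      middlewares: [no-guest@file]
      tls:
        certResolver: gandi         # assumed DNS-01 resolver name
      service: jellyfin
  services:
    jellyfin:
      loadBalancer:
        servers:
          - url: "http://192.168.0.100:8096"   # assumed backend host:port
EOF
```

If the file provider's `watch` option is on, Traefik picks the new route up without a restart.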
---
## Transcoding
## What you get
GTX 1660 Ti is present on nullstone but `nvidia-smi` currently fails — driver
is broken or not loaded. Jellyfin runs CPU-only transcode for now. After
fixing the driver, add the standard NVIDIA hwaccel block in compose:
- **Best-quality everything.** 4K where the source supports it, AI-upscaled
masters where it doesn't. No 480p filler, no junk encodes.
- **Curated, not crawled.** Every title is hand-imported, hand-cleaned, and
hand-checked before it goes live. Junk files, sample clips, and stray
artwork never make it in.
- **Polished metadata.** Posters, backdrops, episode stills, cast, and
descriptions are all locked to the canonical source — no wrong-show
matches, no broken artwork, no foreign-language drift.
- **English-first UI, every account.** No surprise German Play buttons, no
browser-locale roulette. Every user is pinned to a consistent experience.
- **Custom theming.** ARRFLIX wordmark, ARRFLIX-red accent (`#E50914`),
loading splash, and a Netflix-grade dark UI. Jellyfin's stock chrome is
hidden — the brand is the surface.
- **Per-user home layouts.** Resume, Next Up, and Latest Media tuned the way
I actually use the app. No "My Media" tile clutter.
- **Subtitles done right.** Sidecar files named to spec, OpenSubtitles
integration, and ffmpeg-extracted tracks where subtitles ship embedded in the container.
```yaml
deploy:
  resources:
    reservations:
      devices:
        - driver: nvidia
          count: all
          capabilities: [gpu]
```
## Live at
…and enable NVENC in Jellyfin's Playback → Transcoding settings.
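Until then, a quick triage sequence for the broken driver (standard commands, nothing ARRFLIX-specific), plus a post-fix check that the container actually sees the GPU; the ffmpeg path is the stock Jellyfin image layout:

```bash
# Host-side triage: why does nvidia-smi fail?
mokutil --sb-state      # SecureBoot enforcing means DKMS modules must be signed
lsmod | grep nvidia     # empty output: kernel module not loaded
dkms status             # was the module built for the running kernel?

# After the driver fix and the compose change above, verify inside the container
docker exec jellyfin nvidia-smi
docker exec jellyfin /usr/lib/jellyfin-ffmpeg/ffmpeg -hide_banner -encoders | grep nvenc
```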
- <https://arrflix.s8n.ru>
## First-run setup
Endpoint is **LAN / tailnet only**. There is no public exposure — if
you're not on the network, you're not getting in. By design.
1. Browse to `https://arrflix.s8n.ru` from the LAN.
2. Create the admin user (Jellyfin onboarding wizard).
3. Add libraries pointing at `/media/movies` and `/media/tv` inside the
container (these map to `/home/user/media/{movies,tv}`).
4. (Optional) Apply Netflix-style theme — see `docs/04-theming-and-users.md`.
---
## Operations docs
## How it works (technical)
Detailed playbooks (research-grade, with API curls, failure modes, recovery):
ARRFLIX runs on self-hosted infrastructure on **nullstone**. The repo you're
looking at is also the deploy source-of-truth: the compose file, library
structure, theming overrides, and operational playbooks all live here. The
streaming engine itself is unbranded plumbing — invisible behind the
ARRFLIX surface.
| File | Topic |
|------|-------|
| [`docs/01-artwork-and-images.md`](docs/01-artwork-and-images.md) | Posters, backdrops, scrapers (TMDB/TVDB/Fanart), refresh API, language fallback |
| [`docs/02-metadata-and-titles.md`](docs/02-metadata-and-titles.md) | Filename parsing, Identify flow, locking the right show, language cascade, multi-episode files |
| [`docs/03-subtitles.md`](docs/03-subtitles.md) | OpenSubtitles plugin (.com), sidecar naming, ffmpeg/mkvextract extraction, per-user prefs |
| [`docs/04-theming-and-users.md`](docs/04-theming-and-users.md) | ElegantFin theme, branding API, multi-user policies, SyncPlay, friend account playbook |
| [`docs/05-file-structure-rules.md`](docs/05-file-structure-rules.md) | Authoritative folder/filename rules for movies, TV, anime, stand-up, concerts, docs, extras, NFO, artwork overrides |
| [`docs/06-per-library-themes.md`](docs/06-per-library-themes.md) | Per-library theming research: JS-injector plugin shim + scoped CSS for Movies/Anime/Music looks |
For operators (and future-me), the technical reference is split across:
## State as of 2026-05-08
- [`ADMIN-GUIDE.md`](ADMIN-GUIDE.md) — single-page day-to-day ops: adding users,
importing media, fixing scrapes, theme breakage, emergency rollback.
- [`ROADMAP.md`](ROADMAP.md) — what's done, what's open, what's deferred.
- [`docs/`](docs/) — research-grade reference docs (artwork, metadata,
subtitles, theming, file-structure rules, per-library themes, cleanup,
filename normalization, force-English, branding leaks, splash, audits).
- **Library**: Futurama (1999) series (TMDB 615), S01–S03, 44 episodes, fully scraped (Polish metadata + posters + backdrops + episode stills)
- **Theme**: ElegantFin v25.12.31 applied via `/System/Configuration/branding`
- **Subtitles**: OpenSubtitles plugin v20 installed; user must add opensubtitles.com creds (free tier = 20 dl/day)
- **Users**: 1 admin (`s8n`); friend account creation playbook in doc 04
Repo lives at <https://git.s8n.ru/s8n/ARRFLIX> (mirror:
<https://flexhub.s8n.ru/s8n/ARRFLIX>).
## Deploy
---
<p align="center"><sub>ARRFLIX — a one-person streaming service that punches above its weight.</sub></p>
```bash
cd /opt/docker/jellyfin
docker compose up -d
```
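A quick smoke check after bringing the stack up; the 302 is the login redirect the roadmap's snapshot table records as healthy:

```bash
docker compose ps jellyfin                    # state should be "running"
curl -ksI https://arrflix.s8n.ru | head -n1   # expect a 302 login redirect
```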

ROADMAP.md

@ -1,142 +1,107 @@
# Roadmap — ARRFLIX
Last revised: **2026-05-08**
What's done, what's open, what's deferred. Update on every commit that lands or
moves an item between buckets.
Last revised: 2026-05-08
---
## Snapshot
## Done
| Metric | Value |
|---|---|
| Prod URL | https://arrflix.s8n.ru → 302 ✓ |
| Dev URL | https://dev.arrflix.s8n.ru → 302 ✓ |
| Theme | **Cineplex v1.0.6** (rolled back from NeutralFin) |
| Repo | `git.s8n.ru/s8n/ARRFLIX` |
| Library | 6 series + 2 movies, 175 eps + 9 featurettes |
| Disk | nullstone /home — 156G free (60% used) |
| Users | 9 (1 admin + 8 non-admin) |
| Snapshot tag | `snapshot-2026-05-08-pre-elegantfin` (rollback) |
| Docs | 17 in `docs/` + ADMIN-GUIDE + ROADMAP |
- [x] **Deploy**: Jellyfin 10.10.3 on nullstone, LAN-only at `arrflix.s8n.ru`, file-provider Traefik route, LE cert via Gandi DNS-01, Pi-hole local DNS pin, userns_mode=host
- [x] **Theme**: ElegantFin v25.12.31 applied via `/System/Configuration/branding` (API sketch after this list)
- [x] **Cast & Crew + Guest Stars**: hidden globally via CustomCss (`#castCollapsible, #guestCastCollapsible`)
- [x] **Library**: TV Shows → `/media/tv/Futurama (1999)/`, 72 eps + 9 featurettes, locked to TMDB 615
- [x] **Cleanup**: Polish set deleted, junk-stripped English set imported, source + staging deleted
- [x] **Plugins**: OpenSubtitles v20 installed (v21+ needs JF 10.11 ABI)
- [x] **Users**: `s8n` (admin), `guest` (non-admin, pw `123`)
- [x] **Wrapper**: `bin/add-jellyfin-user.sh` for canonical user creation
- [x] **Home layout**: My Media tile row dropped per user (resume / resumeaudio / nextup / latestmedia)
- [x] **Docs**: 01–08 (artwork, metadata, subs, theming, file-structure, per-lib themes, cleanup, naming) + ADMIN-GUIDE.md
- [x] Imported: The Incredible Hulk (2008)
- [x] Imported: Idiocracy (2006)
- [x] Imported: American Dad! (2005) S01-S04 (58 eps)
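The theme bullet above is a named-configuration write. A hedged read-modify-write sketch in the same curl + python style as the `bin/` scripts (the token env var name is an assumption; the CSS selector is the Cast & Crew hide from the bullet above):

```bash
# Sketch: append CSS to CustomCss without clobbering the rest of branding.
AUTH="MediaBrowser Token=$JELLYFIN_API_TOKEN"
curl -ks "$JELLYFIN_URL/System/Configuration/branding" \
  -H "Authorization: $AUTH" > /tmp/branding.json
python3 - <<'PY' > /tmp/branding-new.json
import json
c = json.load(open("/tmp/branding.json"))
css = "#castCollapsible, #guestCastCollapsible { display: none !important; }"
c["CustomCss"] = (c.get("CustomCss") or "") + "\n" + css
print(json.dumps(c))
PY
curl -ks -X POST "$JELLYFIN_URL/System/Configuration/branding" \
  -H "Authorization: $AUTH" -H "Content-Type: application/json" \
  --data-binary @/tmp/branding-new.json
```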
---
## 🟥 Open — High value (do first)
## Open — actionable now
| # | Item | Effort | Blocker |
### High value
- [ ] **OpenSubtitles credentials**
- Owner signs up at opensubtitles.**com** (NOT .org)
- I POST creds to `/Plugins/<id>/Configuration` per [docs/03 § 3.4](docs/03-subtitles.md) (hedged curl sketch after this list)
- Test by triggering subtitle search on one Futurama episode
- Free tier = 20 dl/day; full library will take ~3 days unless VIP
- [ ] **GPU transcode (nvidia driver)**
- GTX 1660 Ti present on nullstone, `nvidia-smi` fails — driver kernel module not loaded
- SecureBoot enabled → DKMS module signing required
- Steps in `README.md § Transcoding`, plus an earlier diagnosis turn in this session's history
- Blocks: anyone watching on a low-power client (phone, fire TV) currently CPU-transcodes
- Estimated wall: 30 min + reboot (nullstone hosts traefik, forgejo, matrix — ~2 min downtime)
- [ ] **Loading-splash rebrand**
- Replace Jellyfin pre-bundle logo with `arrflix.s8n.ru` wordmark + 4-bar pulse spinner
- Approach: bind-mount patched `/jellyfin/jellyfin-web/index.html` per the plan in this session's history
- Doc to write: `docs/09-loading-splash.md` (pre-bundle vs CustomCss timing, regen-on-upgrade)
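For the OpenSubtitles item above, a hedged sketch of the config POST; the plugin id comes from `GET /Plugins`, and the credential field names are assumptions (docs/03 § 3.4 is authoritative):

```bash
AUTH="MediaBrowser Token=$JELLYFIN_API_TOKEN"
# Find the installed plugin's id
curl -ks "$JELLYFIN_URL/Plugins" -H "Authorization: $AUTH" \
  | python3 -c "import json,sys; [print(p['Id'], p['Name']) for p in json.load(sys.stdin)]"
# POST the creds (field names assumed; check the plugin's current config first)
curl -ks -X POST "$JELLYFIN_URL/Plugins/<plugin-id>/Configuration" \
  -H "Authorization: $AUTH" -H "Content-Type: application/json" \
  -d '{"Username": "<os-user>", "Password": "<os-pass>"}'
```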
### Medium value
- [ ] **Extract `bin/cleanup-import.sh` and `bin/normalize.py`** from docs 07 + 08 into runnable repo files (currently embedded in markdown only)
- [ ] **Per-library themes (doc 06)**
- Install `n00bcodr/Jellyfin-JavaScript-Injector` plugin
- Ship 30-line shim that mirrors `topParentId` + `collectionType` from URL hash to body class
- Add three scoped CSS blocks (`body.lib-movies` Netflix, `body.lib-anime` Crunchyroll, `body.lib-music` Spotify)
- Source CSS hunt: 5 Netflix-flavoured bases listed in doc 06; Crunchyroll + Spotify must be hand-built (no existing theme)
- Verdict per doc 06: "tinted, branded, recognisable" — NOT pixel-perfect
- [ ] **Audit-vs-rules pass on current state**
- `/home/user/media/tv/Futurama (1999)/` already conforms post-import
- But: subtitle sidecars absent (waiting on OpenSubtitles creds)
- Featurettes folder is lowercase ✓
- Year in parens ✓
- SXXEXX zero-padded ✓
- Episode title separator ` - `
### Low value
- [ ] **Library scaffolding**: empty `/media/{movies,anime,musicvideos}/` libraries exist; no content yet
- [ ] **Backup strategy** for `/home/docker/jellyfin/config/` (DB + watched-state). Currently zero backups. Tie into the existing nullstone backup chain; sketch after this list.
- [ ] **Forgejo Actions CI** for the repo (lint compose, validate `bin/*.sh` with shellcheck, render docs)
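For the backup item above, a minimal sketch, assuming a plain tar into a staging path (path assumed) that the existing nullstone backup chain picks up; the container is stopped first so the SQLite DB is copied in a consistent state:

```bash
cd /opt/docker/jellyfin
docker compose stop jellyfin
tar -C /home/docker/jellyfin -czf \
  "/srv/backups/jellyfin-config-$(date +%F).tar.gz" config   # dest path assumed
docker compose start jellyfin
```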
---
## Blocked / waiting on owner
- OpenSubtitles creds → owner has not signed up yet
- nvidia driver fix → owner needs to run sudo commands or approve disable-SecureBoot path
- Decision on per-library themes (doc 06): green-light or skip
---
## Deferred
- **Pixel-perfect Netflix/Crunchyroll/Spotify per-library**: would require 3 separate Jellyfin instances on subdomains. ~100× maintenance cost. Doc 06 § 5. Don't do.
- **Custom Jellyfin Docker image**: `FROM jellyfin/jellyfin + COPY index.html`. Cleaner than bind-mount for splash + JS injector but extra build pipeline. Defer until ≥3 web-bundle overrides needed.
- **Subdomain split for friend-only access**: friend already gets non-admin Jellyfin user via `bin/add-jellyfin-user.sh` with `EnabledFolders` ACL. Subdomain not necessary.
- **Move to alternative web client (Jellyfin-Vue)**: replaces the whole UI, breaks ElegantFin + JS Injector. Owner explicitly wants Netflix-y, not vue-y. Don't do.
- **Hardware change**: 4 TB HDD on nullstone idle. Wait until library exceeds 500 GB before activating second-path library mounts (doc 05 § Architecture C).
---
## Tracking
When an item moves to **Done**, link the commit hash. When it stalls, note the blocker date. Don't let entries rot — review on the first of each month.
| Item | Status | Last touch | Owner |
|---|---|---|---|
| H1 | OpenSubtitles credentials (auth fixes log spam too — doc 13 win 2) | S | **owner signs up at opensubtitles.com** |
| H2 | GPU transcode (nvidia driver kernel module + container toolkit + SecureBoot signing) | L | **owner sudo + reboot** |
| H3 | Apply `bin/force-english-all-users.sh` (German Play button breaks UX for non-English browsers) | S | none — owner runs |
| H4 | Backup `/home/docker/jellyfin/config/` off-host (no automated backup yet) | M | strategy decision |
## 🟨 Open — Medium value
| # | Item | Effort | Notes |
|---|---|---|---|
| M1 | Tune detail-page backdrop gradient stops if text contrast off | S | doc 14 §7 |
| M2 | EnableThrottling + EnableSegmentDeletion (kills wasted ffmpeg-after-disconnect) | S | doc 13 win 1 |
| M3 | KnownProxies + LocalNetworkSubnets in network.xml (fixes session origin on WAN endpoint) | S | doc 13 win 3 |
| M4 | PWA manifest bind-mount — kills "Jellyfin" name on Android/iOS install | M | doc 16 phase 1 |
| M5 | Logo-screensaver disable + i18n DOM-rewrite shim | M | doc 16 phases 2+3 |
| M6 | Extract `bin/cleanup-import.sh` + `normalize.py` from doc bodies into runnable files | S | docs 07/08 |
| M7 | Per-library themes (JS injector plugin + body class shim) | M | doc 06 — "tinted, not pixel-perfect" |
## 🟩 Open — Low value (nice-to-have)
| # | Item | Effort | Notes |
|---|---|---|---|
| L1 | Forgejo Actions CI (lint compose, shellcheck bin/, render docs) | M | not started |
| L2 | High-res ARRFLIX wordmark for desktop splash variant (currently 235×85, looks soft on 1080p+) | S | doc 14 finding |
| L3 | Hide lone "User" h3 header above Sign Out (cosmetic) | S | open Q from settings-fix agent |
| L4 | Rotate dev admin password (currently same as prod for parity) | S | open Q from settings-fix agent |
---
## 🚫 Blocked / waiting
| Item | Blocker | Action owner |
|---|---|---|
| OpenSubtitles auth | account signup at .com | **s8n** |
| Nvidia GPU | sudo + reboot decision | **s8n** |
| WAN public access | home router port-forward 80/443 → 192.168.0.100 | **s8n** |
---
## 🔒 Deferred (with reason)
| Item | Reason |
|---|---|
| Pixel-perfect Netflix/Crunchyroll/Spotify per-lib themes | requires 3 separate Jellyfin instances on subdomains; ~100× maintenance cost. Doc 06 |
| Custom Jellyfin Docker image (FROM jellyfin + COPY index.html) | bind-mount works; defer until ≥3 web-bundle overrides needed |
| Subdomain split for friend-only access | non-admin user policies + EnabledFolders ACL already do this on a single instance |
| Move to Jellyfin-Vue alt web client | replaces UI, breaks current branding stack |
| 4 TB HDD activation | wait until library exceeds 500 GB; currently 50G |
---
## ✅ Done
### Branding + theme
- ✅ Theme: ElegantFin → Cineplex → ElegantFin → NeutralFin → **Cineplex v1.0.6 (final)**, snapshot tag for rollback
- ✅ ARRFLIX logo data-URL injected — overrides Cineplex's logo on `.adminDrawerLogo img` + `.pageTitleWithLogo` (split-rule per element type, no overlap)
- ✅ Browser tab title `ARRFLIX` + favicon = ARRFLIX wordmark (via index.html bind-mount)
- ✅ Pre-bundle splash → ARRFLIX wordmark (no more Jellyfin logo on first paint; bind-mount sketch after this list)
- ✅ LoginDisclaimer "Welcome to ARRFLIX - Private invite only service"
- ✅ Critical-path inline `<style>` in index.html eliminates pre-bundle theme flash
- ✅ JS shim in index.html: title-lock + favicon-lock + nukeSettings + SW unregister
- ✅ Detail-page backdrop full-bleed gradient fix (was 17vw black band; now Netflix-style)
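The index.html bind-mount pattern behind several items above, sketched; the container path is the stock image layout, and the file regenerates on image upgrade, so re-extract and re-patch after pulling a new Jellyfin:

```bash
# Pull the stock file out once, patch it, bind-mount it back read-only.
docker cp jellyfin:/jellyfin/jellyfin-web/index.html web-overrides/index.html
# ...edit splash/logo markup, then add to the compose service:
#   volumes:
#     - ./web-overrides/index.html:/jellyfin/jellyfin-web/index.html:ro
docker compose up -d jellyfin
```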
### UI hides + tweaks (CSS in CustomCss)
- ✅ Cast & Crew + Guest Stars sections (`#castCollapsible, #guestCastCollapsible`)
- ✅ Quick Connect button + server-side disable (`.btnQuick`, `QuickConnectAvailable=false`)
- ✅ Settings drawer link v2 (`a.btnSettings, [data-itemid="settings"]` — verified on dev with headless A/B before swap)
- ✅ Header icons: SyncPlay group, Cast, User menu (`.headerSyncButton`, `.headerCastButton`, `.headerUserButton`)
- ✅ Unwatched-count badges (`.countIndicator`)
- ✅ Settings menu page access (`EnableUserPreferenceAccess=false` per non-admin)
- ✅ Slider thumbs blue → white (scrubber + volume on player OSD)
- ✅ Pure-black background
### Library
- ✅ Cleanup playbook: 17-doc set including pre-import strip rules + filename normalization
- ✅ Imports applied via cleanup → normalize pipeline:
- Futurama (1999) S01–S04, 72 eps + 9 featurettes (TMDB 615)
- American Dad! (2005) S01–S04, 58 eps (TMDB 1433)
- Rick and Morty (2013) S01, 11 eps (TMDB 60625)
- Star Wars: Maul Shadow Lord (2026) S01, 10 eps (TMDB 289219)
- Obi-Wan Kenobi (2022) S01, 6 eps + 4 featurettes (TMDB 92830)
- The Incredible Hulk (2008) (TMDB 1724)
- Idiocracy (2006) (TMDB 7512)
- ⏳ The Mandalorian (2019) S01–S03 — 18/24 mkv on disk, scrape in flight
- ✅ Futurama season posters re-locked to highest-res TMDB (was low-res)
- ✅ Polish set replaced with English; libraries flipped `pl/PL` → `en/US`
### Users + access
- ✅ 9 users (`s8n` admin, `5`, `64bitpotato`, `aloy`, `guest`, `house`, `marco`, `pet`, `yummyhunny`)
- ✅ All non-admin policies: `IsAdministrator=false`, `EnableContentDeletion=false`, `EnableUserPreferenceAccess=false`, `LoginAttemptsBeforeLockout=5`
- ✅ Wrapper `bin/add-jellyfin-user.sh` — single-call canonical user creation (4-step pipeline: create + home layout + lang prefs + restricted policy)
- ✅ Home layout per-user: resume → resumeaudio → nextup → latestmedia (My Media tile row dropped)
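The per-user home layout lives in DisplayPreferences `CustomPrefs`. A hedged sketch of the write, mirroring the `dp-cur`/`dp-fix` pattern in `bin/add-jellyfin-user.sh` (which stays the canonical path); the `homesection*` keys and `client=emby` scoping are how the web client reads them:

```bash
AUTH="MediaBrowser Token=$JELLYFIN_API_TOKEN"
DP="$JELLYFIN_URL/DisplayPreferences/usersettings?userId=$USER_ID&client=emby"
curl -ks "$DP" -H "Authorization: $AUTH" > /tmp/dp.json
python3 - <<'PY' > /tmp/dp-new.json
import json
d = json.load(open("/tmp/dp.json"))
cp = d.setdefault("CustomPrefs", {})
for i, sec in enumerate(["resume", "resumeaudio", "nextup", "latestmedia"]):
    cp[f"homesection{i}"] = sec
print(json.dumps(d))
PY
curl -ks -X POST "$DP" -H "Authorization: $AUTH" \
  -H "Content-Type: application/json" --data-binary @/tmp/dp-new.json
```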
### Infra
- ✅ Domain rename: `tv.s8n.ru` → `nasflix.s8n.ru` → **`arrflix.s8n.ru`**
- ✅ Repo rename: `jellyfin-stack` → `NASFLIX` → **`ARRFLIX`** at `git.s8n.ru/s8n/ARRFLIX`
- ✅ Pi-hole local DNS for `arrflix.s8n.ru` + `dev.arrflix.s8n.ru`
- ✅ LE certs via Gandi DNS-01 for both prod + dev
- ✅ WAN window: Gandi public A record `arrflix.s8n.ru → 82.31.156.86`, no-guest middleware dropped, lockout=5 baked in (router port-forward pending)
- ✅ Dev instance: `dev.arrflix.s8n.ru`, isolated config, shared `/home/user/media:/media:ro` mount with prod (read-only), 7 mirror users + s8n-dev admin
- ✅ Snapshot tag `snapshot-2026-05-08-pre-elegantfin` for one-command rollback
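The snapshot tag makes rollback a checkout plus redeploy. A sketch, assuming the repo checkout is the compose dir from the README's deploy section:

```bash
cd /opt/docker/jellyfin
git checkout snapshot-2026-05-08-pre-elegantfin
docker compose up -d --force-recreate jellyfin
```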
### Docs (17 + 2 indexes + bin/)
- ✅ `ADMIN-GUIDE.md` (entry point)
- ✅ `ROADMAP.md` (this file)
- ✅ `docs/01..16` covering: artwork, metadata, subtitles, theming-and-users, file-structure, per-library-themes, cleanup, normalization, WAN exposure, SPA shim, NeutralFin audit, dev instance, optimization audit, theme audit, force English, Jellyfin-branding leaks, dev mirror + settings fix
- ✅ `bin/add-jellyfin-user.sh`, `bin/inject-shim.py`, `bin/force-english-all-users.sh`
---
## Conventions
When marking an item:
- Move **Open** → **Done** when shipped + verified
- Move to **Blocked** when waiting on owner / external
- Move to **Deferred** with one-line reason
- Update **Snapshot** stats on every revision
| OpenSubtitles creds | blocked-on-owner | 2026-05-07 | s8n |
| nvidia driver | blocked-on-owner | 2026-05-07 | s8n |
| Loading splash | open-actionable | 2026-05-08 | claude |
| Extract bin/ scripts | open-actionable | 2026-05-08 | claude |
| Per-library themes | open-actionable (decision pending) | 2026-05-08 | claude |
| Library scaffolding | open-low-value | 2026-05-08 | s8n |
| Backup strategy | open-low-value | not-started | claude |
| Forgejo CI | open-low-value | not-started | claude |

Three binary screenshot assets removed (not shown): 116 KiB, 2.4 MiB, 980 KiB.

bin/add-jellyfin-user.sh

@ -71,7 +71,7 @@ rm -f /tmp/dp-cur.$$.json /tmp/dp-fix.$$.json
[[ "$HTTP" == "204" ]] || { echo " DisplayPreferences POST failed: $HTTP"; exit 1; }
echo " Home layout applied."
echo "[3/4] Setting language prefs (force English everywhere, no fallback)..."
echo "[3/4] Setting language prefs (audio=eng, subs=eng default)..."
curl -ks "$JELLYFIN_URL/Users/$USER_ID" -H "Authorization: $AUTH" > /tmp/u.$$.json
python3 - <<EOF > /tmp/u-fix.$$.json
import json
@ -81,8 +81,6 @@ c['SubtitleMode'] = 'Default'
c['SubtitleLanguagePreference'] = 'eng'
c['AudioLanguagePreference'] = 'eng'
c['PlayDefaultAudioTrack'] = True
c['UICulture'] = 'en-US'
c['DisplayMissingEpisodes'] = False
print(json.dumps(c))
EOF
HTTP=$(curl -ks -X POST "$JELLYFIN_URL/Users/$USER_ID/Configuration" \

bin/apply-26-incident-fixes.sh

@ -1,181 +0,0 @@
#!/usr/bin/env bash
# apply-26-incident-fixes.sh
#
# Re-applies the three server-state fixes from docs/26 if branding.xml /
# encoding.xml drift back to broken state (e.g. after a Jellyfin restore).
#
# 1. CustomCss: Cineplex hardcoded "Abspielen" → "Play"
# 2. CustomCss: Backdrop transparent-scope using :has() (BLACK-PASS occluded backdrop layer)
# 3. encoding.xml: EnableThrottling=false + EnableSegmentDeletion=false (kills HLS 499)
#
# Usage: ssh user@nullstone "$(cat bin/apply-26-incident-fixes.sh)"
# Idempotent: re-running is safe.
set -euo pipefail
# 3+5. encoding.xml — disable throttling + segment deletion (HLS 499)
# AND disable software tonemapping (CPU-only nullstone
# cannot sustain real-time 4K HDR tonemap+x264, ffmpeg
# runs at ~0.5x → 18s wait time before video starts;
# R&M is fake-HDR per doc 21 anyway, so no visual loss)
for cfg in /home/docker/jellyfin/config/config/encoding.xml \
/home/docker/jellyfin-dev/config/config/encoding.xml; do
[ -f "$cfg" ] || continue
cp -n "$cfg" "$cfg.bak.pre-doc26" || true
sed -i \
-e 's|<EnableThrottling>true</EnableThrottling>|<EnableThrottling>false</EnableThrottling>|' \
-e 's|<EnableSegmentDeletion>true</EnableSegmentDeletion>|<EnableSegmentDeletion>false</EnableSegmentDeletion>|' \
-e 's|<EnableTonemapping>true|<EnableTonemapping>false|' \
-e 's|<EnableVppTonemapping>true|<EnableVppTonemapping>false|' \
"$cfg"
echo "[+] patched $cfg"
done
# 1+2. branding.xml CustomCss — Abspielen + backdrop transparent-scope
patch_branding() {
local cfg="$1"
[ -f "$cfg" ] || return 0
if grep -q "ARRFLIX 2026-05-09" "$cfg"; then
echo "[=] $cfg already has doc-26 patch"
return 0
fi
cp -n "$cfg" "$cfg.bak.pre-doc26" || true
python3 - <<PY
p = "$cfg"
s = open(p).read()
patch = """
/* ARRFLIX 2026-05-09 — incident fixes (see docs/26-incident-2026-05-09-...).
INC1: Cineplex theme hardcodes German "Abspielen" via content: ::after.
INC1: BLACK-PASS occludes backdrop; transparent-scope via :has().
INC2: pin backdrop position:fixed so it persists across scroll.
INC3: extend transparent-scope through detail-page sub-sections so
section wrappers don't paint over the pinned backdrop.
INC4: override the 2026-05-08 .emby-scroller=#000 rule on detail page
(it was painting a black band behind every carousel — most visible
on admin-only "More from Season" / "More Like This"). */
.mainDetailButtons .material-icons.play_arrow::after {
content: "Play" !important;
}
.itemDetailPage,
.layout-desktop:has(.itemDetailPage),
.layout-mobile:has(.itemDetailPage),
.layout-tv:has(.itemDetailPage),
.mainAnimatedPages:has(.itemDetailPage),
.pageContainer:has(.itemDetailPage),
.padded-bottom-page:has(.itemDetailPage),
.libraryPage:has(.itemDetailPage),
.absolutePageTabContent:has(.itemDetailPage) {
background-color: transparent !important;
background: transparent !important;
}
.layout-desktop .backdropContainer,
.layout-mobile .backdropContainer,
.layout-tv .backdropContainer,
.layout-desktop .backgroundContainer,
.layout-mobile .backgroundContainer,
.layout-tv .backgroundContainer {
position: fixed !important;
top: 0 !important;
left: 0 !important;
width: 100vw !important;
height: 100vh !important;
z-index: 0 !important;
}
.layout-desktop .backgroundContainer.withBackdrop::after,
.layout-mobile .backgroundContainer.withBackdrop::after,
.layout-tv .backgroundContainer.withBackdrop::after {
content: "";
position: absolute;
inset: 0;
background: linear-gradient(
180deg,
rgba(0,0,0,0.00) 0%,
rgba(0,0,0,0.00) 35%,
rgba(0,0,0,0.40) 70%,
rgba(0,0,0,0.75) 100%
);
pointer-events: none;
z-index: 1;
}
.itemDetailPage,
.itemDetailPage > *,
.detailPageContent,
.detailPagePrimaryContainer,
.detailPageWrapperContainer,
.detailPageContent > *,
.detailVerticalSection,
.detailVerticalSection-extrabottompadding,
.detailSection,
.detailSectionContent,
.itemsContainer,
.scrollSlider,
.scrollSliderContainer,
.padded-bottom-page,
.detailPagePrimaryContent,
.sectionTitleContainer,
.detailRibbon,
.subtitleAudioContainer,
.detailPageRoot {
background-color: transparent !important;
background: transparent !important;
}
/* INC4: 2026-05-08 home-page "kill gray band" rule paints .emby-scroller
#000 unscoped — that's the OPAQUE wrapper around every carousel inside
.itemDetailPage. Override back to transparent on detail page only. */
.itemDetailPage .emby-scroller,
.itemDetailPage .emby-scroller-container,
.itemDetailPage .verticalSection,
.itemDetailPage .padded-top-focusscale,
.itemDetailPage .padded-bottom-focusscale,
.itemDetailPage .moreFromSeasonSection,
.itemDetailPage .moreFromArtistSection,
.itemDetailPage .scrollSliderContainer,
.itemDetailPage .scrollButtonContainer {
background-color: transparent !important;
background: transparent !important;
}
/* INC7 2026-05-09: BLACK-PASS paints .libraryPage #000; #videoOsdPage uses
that class so the OSD page covers <video> with opaque black. <video>
decodes frames (canvas drawImage luma=84) but visually 100% black until
we exempt the OSD page from BLACK-PASS via :has(.htmlVideoPlayer). */
.libraryPage:has(.htmlVideoPlayer),
.libraryPage#videoOsdPage,
#videoOsdPage,
#videoOsdPage .pageContainer,
#videoOsdPage .layout-desktop,
#videoOsdPage .mainAnimatedPages {
background-color: transparent !important;
background: transparent !important;
}
/* INC5: kill grey scrollbar groove at page bottom (Chrome native scrollbar
default = grey track; appears as ~15px strip at viewport bottom). Style
all scrollbars to ARRFLIX palette. */
*::-webkit-scrollbar {
background: #000000 !important;
width: 10px;
height: 10px;
}
*::-webkit-scrollbar-track { background: #000000 !important; }
*::-webkit-scrollbar-thumb {
background: #2a2a2a !important;
border-radius: 5px;
}
*::-webkit-scrollbar-thumb:hover { background: #3a3a3a !important; }
*::-webkit-scrollbar-corner { background: #000000 !important; }
* { scrollbar-color: #2a2a2a #000000; }
html, body { scrollbar-color: #2a2a2a #000000; }
"""
s = s.replace("</CustomCss>", patch + "</CustomCss>")
open(p, "w").write(s)
PY
echo "[+] patched $cfg"
}
patch_branding /home/docker/jellyfin/config/config/branding.xml
patch_branding /home/docker/jellyfin-dev/config/config/branding.xml
# Restart so changes take effect
docker restart jellyfin jellyfin-dev 2>/dev/null || docker restart jellyfin
echo "[*] Done. Verify with bin/headless-test.py."

bin/english-lockdown-runner.sh

@ -1,202 +0,0 @@
#!/usr/bin/env bash
# english-lockdown-runner.sh — idempotent re-apply of the ARRFLIX English-only lockdown.
#
# See docs/20-english-only-lockdown.md for the full design, layer breakdown,
# and drift-check procedure. This script handles two of the three layers:
#
# 1. Server-wide: UICulture / PreferredMetadataLanguage / MetadataCountryCode
# via POST /System/Configuration.
# 2. Per-user: UICulture / AudioLanguagePreference / SubtitleLanguagePreference /
# PlayDefaultAudioTrack via POST /Users/{id}/Configuration for every account.
#
# The third layer (web SPA shim — navigator.language override + language-switcher
# CSS hide) is served via the bind-mounted web-overrides/ tree; nothing for
# this script to push.
#
# Idempotent — running it twice produces the same end state. Each layer is
# read, merged with English defaults, and POSTed back. Skips writes when the
# server already matches.
#
# Usage:
# JELLYFIN_API_TOKEN=<admin-token> ./english-lockdown-runner.sh
#
# Optional env:
# JELLYFIN_URL default https://arrflix.s8n.ru
# DRY_RUN default unset; set DRY_RUN=1 to print payloads without POSTing
#
# Exit codes:
# 0 every layer landed (or already correct)
# 1 at least one POST failed; check stderr/stdout for which surface
# 2 bad invocation (missing required env)
#
# Token rotation note: the API token has full admin scope. Use a dedicated
# token, not a personal-account session token, and rotate after offboarding
# any operator with shell access to the host running this script.
set -euo pipefail
JELLYFIN_URL="${JELLYFIN_URL:-https://arrflix.s8n.ru}"
JELLYFIN_API_TOKEN="${JELLYFIN_API_TOKEN:?set JELLYFIN_API_TOKEN=<admin-token>; aborting (see docs/20-english-only-lockdown.md)}"
DRY_RUN="${DRY_RUN:-}"
AUTH="MediaBrowser Token=$JELLYFIN_API_TOKEN"
# Server-wide targets
SERVER_UI_CULTURE="en-US"
SERVER_METADATA_LANG="en"
SERVER_METADATA_COUNTRY="US"
# Per-user targets
USER_UI_CULTURE="en-US"
USER_AUDIO_LANG="eng"
USER_SUBTITLE_LANG="eng"
USER_PLAY_DEFAULT_AUDIO="true"
FAIL_COUNT=0
# ---------------------------------------------------------------------------
# Layer 1: server-wide config
# ---------------------------------------------------------------------------
echo "[*] Layer 1: server-wide /System/Configuration"
SERVER_TMP_IN=$(mktemp)
SERVER_TMP_OUT=$(mktemp)
trap 'rm -f "$SERVER_TMP_IN" "$SERVER_TMP_OUT"' EXIT
curl -ks "$JELLYFIN_URL/System/Configuration" -H "Authorization: $AUTH" > "$SERVER_TMP_IN"
CURRENT_SERVER=$(python3 -c "
import json
with open('$SERVER_TMP_IN') as f: c = json.load(f)
print(f\"UICulture={c.get('UICulture','<absent>')} PreferredMetadataLanguage={c.get('PreferredMetadataLanguage','<absent>')} MetadataCountryCode={c.get('MetadataCountryCode','<absent>')}\")
")
echo " before: $CURRENT_SERVER"
NEEDS_SERVER_WRITE=$(python3 -c "
import json
with open('$SERVER_TMP_IN') as f: c = json.load(f)
ok = (
c.get('UICulture') == '$SERVER_UI_CULTURE'
and c.get('PreferredMetadataLanguage') == '$SERVER_METADATA_LANG'
and c.get('MetadataCountryCode') == '$SERVER_METADATA_COUNTRY'
)
print('0' if ok else '1')
")
if [[ "$NEEDS_SERVER_WRITE" == "0" ]]; then
echo " ok: server already pinned, skipping write"
else
python3 - <<PYEOF > "$SERVER_TMP_OUT"
import json
with open("$SERVER_TMP_IN") as f: c = json.load(f)
c["UICulture"] = "$SERVER_UI_CULTURE"
c["PreferredMetadataLanguage"] = "$SERVER_METADATA_LANG"
c["MetadataCountryCode"] = "$SERVER_METADATA_COUNTRY"
print(json.dumps(c))
PYEOF
if [[ -n "$DRY_RUN" ]]; then
echo " DRY_RUN: would POST $(wc -c < "$SERVER_TMP_OUT") bytes to /System/Configuration"
else
HTTP=$(curl -ks -X POST "$JELLYFIN_URL/System/Configuration" \
-H "Authorization: $AUTH" \
-H "Content-Type: application/json" \
--data-binary @"$SERVER_TMP_OUT" -w "%{http_code}" -o /dev/null)
if [[ "$HTTP" == "204" || "$HTTP" == "200" ]]; then
echo " after: UICulture=$SERVER_UI_CULTURE PreferredMetadataLanguage=$SERVER_METADATA_LANG MetadataCountryCode=$SERVER_METADATA_COUNTRY (HTTP $HTTP)"
else
echo " ERROR: POST /System/Configuration returned HTTP $HTTP" >&2
FAIL_COUNT=$((FAIL_COUNT + 1))
fi
fi
fi
echo
# ---------------------------------------------------------------------------
# Layer 2: per-user config
# ---------------------------------------------------------------------------
echo "[*] Layer 2: per-user /Users/{id}/Configuration"
USERS_JSON=$(curl -ks "$JELLYFIN_URL/Users" -H "Authorization: $AUTH")
USER_COUNT=$(echo "$USERS_JSON" | python3 -c "import json,sys; print(len(json.load(sys.stdin)))")
echo " $USER_COUNT users found."
echo
# Process-substitution to keep `set -e` semantics in the loop body.
while IFS=$'\t' read -r USER_ID USER_NAME OLD_UI OLD_AUDIO OLD_SUB OLD_PLAY; do
TMP_IN=$(mktemp)
TMP_OUT=$(mktemp)
curl -ks "$JELLYFIN_URL/Users/$USER_ID" -H "Authorization: $AUTH" > "$TMP_IN"
NEEDS_USER_WRITE=$(python3 -c "
import json
with open('$TMP_IN') as f: u = json.load(f)
c = u.get('Configuration', {})
ok = (
c.get('UICulture') == '$USER_UI_CULTURE'
and c.get('AudioLanguagePreference') == '$USER_AUDIO_LANG'
and c.get('SubtitleLanguagePreference') == '$USER_SUBTITLE_LANG'
and c.get('PlayDefaultAudioTrack') is True
)
print('0' if ok else '1')
")
if [[ "$NEEDS_USER_WRITE" == "0" ]]; then
echo " [ok] $USER_NAME ($USER_ID) — already pinned"
rm -f "$TMP_IN" "$TMP_OUT"
continue
fi
python3 - <<PYEOF > "$TMP_OUT"
import json
with open("$TMP_IN") as f: u = json.load(f)
c = u["Configuration"]
c["UICulture"] = "$USER_UI_CULTURE"
c["AudioLanguagePreference"] = "$USER_AUDIO_LANG"
c["SubtitleLanguagePreference"] = "$USER_SUBTITLE_LANG"
c["PlayDefaultAudioTrack"] = True
print(json.dumps(c))
PYEOF
if [[ -n "$DRY_RUN" ]]; then
echo " [dry] $USER_NAME ($USER_ID) — would POST $(wc -c < "$TMP_OUT") bytes"
else
HTTP=$(curl -ks -X POST "$JELLYFIN_URL/Users/$USER_ID/Configuration" \
-H "Authorization: $AUTH" \
-H "Content-Type: application/json" \
--data-binary @"$TMP_OUT" -w "%{http_code}" -o /dev/null)
if [[ "$HTTP" == "204" || "$HTTP" == "200" ]]; then
echo " [pin] $USER_NAME ($USER_ID) — UICulture=$USER_UI_CULTURE Audio=$USER_AUDIO_LANG Sub=$USER_SUBTITLE_LANG PlayDefault=true (HTTP $HTTP)"
else
echo " [FAIL] $USER_NAME ($USER_ID) — HTTP $HTTP" >&2
FAIL_COUNT=$((FAIL_COUNT + 1))
fi
fi
rm -f "$TMP_IN" "$TMP_OUT"
done < <(echo "$USERS_JSON" | python3 -c "
import json, sys
for u in json.load(sys.stdin):
c = u.get('Configuration', {})
print('\t'.join([
u['Id'],
u['Name'],
str(c.get('UICulture', '')),
str(c.get('AudioLanguagePreference', '')),
str(c.get('SubtitleLanguagePreference', '')),
str(c.get('PlayDefaultAudioTrack', '')),
]))
")
echo
# ---------------------------------------------------------------------------
# Summary + exit
# ---------------------------------------------------------------------------
if [[ $FAIL_COUNT -eq 0 ]]; then
echo "[*] Done. All layers pinned (or already correct). Drift-check commands"
echo " in docs/20-english-only-lockdown.md."
exit 0
else
echo "[!] Done with $FAIL_COUNT failure(s). Re-run after investigating;"
echo " drift-check commands in docs/20-english-only-lockdown.md." >&2
exit 1
fi

bin/force-english-all-users.sh

@ -1,87 +0,0 @@
#!/usr/bin/env bash
# force-english-all-users.sh — pin Configuration.UICulture=en-US on every Jellyfin user.
#
# Why this exists: see docs/15-force-english.md.
# TL;DR — when a user has UICulture unset, the Jellyfin web SPA falls back to
# browser Accept-Language. Owner saw "Abspielen" (German "Play") on a Play
# button because someone's browser sends de-*. Pinning UICulture per user
# overrides Accept-Language and gives every account English UI regardless
# of where they log in from.
#
# Read-modify-write on /Users/{id}/Configuration. Idempotent — running it
# twice produces the same end state. Prints before/after UICulture per user.
#
# Usage:
# JELLYFIN_TOKEN=<admin-token> ./force-english-all-users.sh
#
# Optional env:
# JELLYFIN_URL default https://arrflix.s8n.ru
# TARGET_LOCALE default en-US (e.g. en-GB also works)
# DRY_RUN default unset; set DRY_RUN=1 to print payloads without POSTing
set -euo pipefail
JELLYFIN_URL="${JELLYFIN_URL:-https://arrflix.s8n.ru}"
JELLYFIN_TOKEN="${JELLYFIN_TOKEN:?set JELLYFIN_TOKEN=<admin-token>}"
TARGET_LOCALE="${TARGET_LOCALE:-en-US}"
DRY_RUN="${DRY_RUN:-}"
AUTH="MediaBrowser Token=$JELLYFIN_TOKEN"
echo "[*] Listing users..."
USERS_JSON=$(curl -ks "$JELLYFIN_URL/Users" -H "Authorization: $AUTH")
COUNT=$(echo "$USERS_JSON" | python3 -c "import json,sys; print(len(json.load(sys.stdin)))")
echo " $COUNT users found."
echo
# Iterate. Pipe-to-while loses set -e on subshell exit, so use process-substitution.
while IFS=$'\t' read -r USER_ID USER_NAME OLD_CULTURE; do
echo "[*] $USER_NAME ($USER_ID)"
echo " before: UICulture=${OLD_CULTURE:-<absent>}"
if [[ "$OLD_CULTURE" == "$TARGET_LOCALE" ]]; then
echo " skip: already $TARGET_LOCALE"
echo
continue
fi
TMP_IN=$(mktemp)
TMP_OUT=$(mktemp)
curl -ks "$JELLYFIN_URL/Users/$USER_ID" -H "Authorization: $AUTH" > "$TMP_IN"
python3 - <<PYEOF > "$TMP_OUT"
import json
with open("$TMP_IN") as f: u = json.load(f)
c = u["Configuration"]
c["UICulture"] = "$TARGET_LOCALE"
print(json.dumps(c))
PYEOF
if [[ -n "$DRY_RUN" ]]; then
echo " DRY_RUN: would POST $(wc -c < "$TMP_OUT") bytes to /Users/$USER_ID/Configuration"
else
HTTP=$(curl -ks -X POST "$JELLYFIN_URL/Users/$USER_ID/Configuration" \
-H "Authorization: $AUTH" \
-H "Content-Type: application/json" \
--data-binary @"$TMP_OUT" -w "%{http_code}" -o /dev/null)
if [[ "$HTTP" != "204" ]]; then
echo " ERROR: POST returned HTTP $HTTP"
rm -f "$TMP_IN" "$TMP_OUT"
exit 1
fi
# Verify
NEW_CULTURE=$(curl -ks "$JELLYFIN_URL/Users/$USER_ID" -H "Authorization: $AUTH" \
| python3 -c "import json,sys; print(json.load(sys.stdin)['Configuration'].get('UICulture','<absent>'))")
echo " after: UICulture=$NEW_CULTURE"
fi
rm -f "$TMP_IN" "$TMP_OUT"
echo
done < <(echo "$USERS_JSON" | python3 -c "
import json, sys
for u in json.load(sys.stdin):
cur = u.get('Configuration', {}).get('UICulture', '')
print(f\"{u['Id']}\t{u['Name']}\t{cur}\")
")
echo "[*] Done. Tell users to hard-refresh (Ctrl-Shift-R) so the SPA reloads"
echo " the locale bundle. Verify on a movie detail page — Play button"
echo " should read 'Play', not 'Abspielen'."

bin/headless-test-v2.py

@ -1,629 +0,0 @@
#!/usr/bin/env python3
"""ARRFLIX headless smoke-test v2.
Why v2 exists (see docs/26 INC4 audit):
v1 had three coverage gaps that let two regressions ship:
- Logged in only as `guest` (non-admin restricted) → admin-only sections
like the "More from Season N" carousel never rendered, so the black
band behind that carousel was invisible to the test.
- Never clicked Play → never observed the <video> element in a real
playback state, so AV1+Opus episodes silently rendering black went
undetected.
- Probed only a hardcoded selector list → any element painting an
opaque background outside that list (e.g. a new section wrapper)
was never reported.
v2 closes those gaps:
1. Multi-user runs: executes the full probe as BOTH admin and non-admin
in the same invocation, writes per-user JSON + screenshots, and
reports a DOM-section diff (sections present for one user but not
the other → admin-only-visible content).
2. Click Play: locates the play button, clicks it, waits 10 s, captures
<video> element state (currentTime, paused, error, readyState, dims),
plus a video-area screenshot and any new console / network errors.
3. Multiple-item coverage: walks an item list (default: HEVC movie + AV1
TV episode + H.264 TV episode if available) and runs the full
detail-page + play probe for each.
4. Section-bg sweep: at scroll-bottom, walks every visible element and
reports any with a non-transparent backgroundColor whose bounding rect
overlaps where the pinned backdrop should be visible. Output goes
into probe.json under "regressions" with an allowlist filter.
5. Golden-screenshot diff: if a known-good screenshot exists at
OUT/golden/<key>.png, the run computes a Pillow pixel diff and writes
<key>-diff.png + a numeric mismatch ratio.
6. Structured JSON: probe.json now has top-level shape
{url, runs:[{user, item, item_kind, probe, play, regressions, ...}]}
so downstream tooling (CI / agents) can parse without grepping.
Usage:
bin/headless-test-v2.py [URL] [OUT_DIR]
URL defaults to https://dev.arrflix.s8n.ru.
OUT_DIR defaults to /tmp/arrflix-headless-v2.
User credentials are determined automatically from URL:
arrflix.s8n.ru → admin=s8n / guest=guest
dev.arrflix.s8n.ru → admin=s8n-dev / guest=guest-mirror
Override via env vars:
ADMIN_USER, ADMIN_PASS, GUEST_USER, GUEST_PASS
ITEMS=id1,id2,id3 # override default item list
Default items (chosen for codec coverage):
- HEVC movie: 7aa5add2c2d8575eda5280b9b9072071 (The Dark Knight)
- AV1 episode: auto-pick first Mike Nolan Show episode
- H.264 episode: auto-pick first non-AV1 episode if available
Exit codes:
0 all runs succeeded, no playback errors, no regression bg elements
1 setup / login failure
2 one or more runs reported playback failure or unallowlisted bg regression
"""
import sys, json, time, os, asyncio, urllib.request, urllib.error, ssl
from pathlib import Path
from playwright.async_api import async_playwright
try:
from PIL import Image, ImageChops
PIL_OK = True
except ImportError:
PIL_OK = False
URL = sys.argv[1] if len(sys.argv) > 1 else "https://dev.arrflix.s8n.ru"
OUT = sys.argv[2] if len(sys.argv) > 2 else "/tmp/arrflix-headless-v2"
os.makedirs(OUT, exist_ok=True)
os.makedirs(os.path.join(OUT, "golden"), exist_ok=True)
# Default credentials by env (URL → admin/guest)
if "dev.arrflix.s8n.ru" in URL:
DEFAULT_ADMIN = ("s8n-dev", "2001dude")
DEFAULT_GUEST = ("guest-mirror", "dev-test-guest")
else:
DEFAULT_ADMIN = ("s8n", "2001dude")
DEFAULT_GUEST = ("guest", "123")
ADMIN_USER = os.environ.get("ADMIN_USER", DEFAULT_ADMIN[0])
ADMIN_PASS = os.environ.get("ADMIN_PASS", DEFAULT_ADMIN[1])
GUEST_USER = os.environ.get("GUEST_USER", DEFAULT_GUEST[0])
GUEST_PASS = os.environ.get("GUEST_PASS", DEFAULT_GUEST[1])
# Default items: HEVC movie known id; TV episodes auto-picked per-user
ITEMS_OVERRIDE = os.environ.get("ITEMS", "").strip()
DEFAULT_HEVC_MOVIE = "7aa5add2c2d8575eda5280b9b9072071" # Dark Knight
MNS_NEEDLE = "mike nolan" # case-insensitive substring of series name for AV1 lookup
DEVICE = "headless-test-v2"
DEVICE_ID = "headless-test-v2-2026-05-09"
CLIENT = "HeadlessV2"
VERSION = "2.0"
# Selectors known to legitimately paint solid bg over backdrop area; if a
# regression sweep finds a bg element NOT on this list overlapping the
# backdrop region, it is flagged. Update intentionally as design changes.
BG_ALLOWLIST = {
# OSD / video player overlays — fine to be opaque
".htmlVideoPlayer", ".videoPlayerContainer", ".osdContent",
".upNextDialog", ".dialogContainer", ".dialog",
# Modal / dialog scrim layers
".dialogBackdrop", ".paperList",
# Top app drawer (intentionally opaque)
".skinHeader", ".headerTop",
}
# ---------- HTTP helpers (raw API) ----------
def auth_header(token=None):
h = (f'MediaBrowser Client="{CLIENT}", Device="{DEVICE}", '
f'DeviceId="{DEVICE_ID}", Version="{VERSION}"')
if token:
h += f', Token="{token}"'
return h
def _req(path, method="GET", body=None, token=None):
data = json.dumps(body).encode() if body is not None else None
req = urllib.request.Request(
f"{URL}{path}",
data=data,
headers={
"Authorization": auth_header(token),
"Content-Type": "application/json",
},
method=method,
)
ctx = ssl._create_unverified_context()
with urllib.request.urlopen(req, context=ctx, timeout=15) as r:
raw = r.read()
return json.loads(raw) if raw else {}
def login(user, password):
return _req("/Users/AuthenticateByName", "POST",
{"Username": user, "Pw": password})
def find_av1_episode(token, user_id):
"""Find first episode of Mike Nolan Show (or any series matching needle)."""
series = _req(
f"/Users/{user_id}/Items?Recursive=true&IncludeItemTypes=Series&Limit=200",
token=token)
target = None
for s in series.get("Items", []):
if MNS_NEEDLE in s.get("Name", "").lower():
target = s
break
if not target:
return None, None
eps = _req(
f"/Shows/{target['Id']}/Episodes?UserId={user_id}&Limit=1",
token=token)
if eps.get("Items"):
return eps["Items"][0]["Id"], f"{target['Name']} - {eps['Items'][0].get('Name','?')}"
return None, None
def find_h264_episode(token, user_id, exclude_series_id=None):
"""Auto-pick first episode of any TV series other than the AV1 one."""
series = _req(
f"/Users/{user_id}/Items?Recursive=true&IncludeItemTypes=Series&Limit=50",
token=token)
for s in series.get("Items", []):
if exclude_series_id and s.get("Id") == exclude_series_id:
continue
if MNS_NEEDLE in s.get("Name", "").lower():
continue
eps = _req(
f"/Shows/{s['Id']}/Episodes?UserId={user_id}&Limit=1",
token=token)
if eps.get("Items"):
return eps["Items"][0]["Id"], f"{s['Name']} - {eps['Items'][0].get('Name','?')}"
return None, None
def resolve_items(token, user_id):
"""Return list of [(item_id, label, kind), ...]."""
if ITEMS_OVERRIDE:
return [(i.strip(), f"override-{n}", "override")
for n, i in enumerate(ITEMS_OVERRIDE.split(",")) if i.strip()]
out = []
# HEVC movie (fixed id)
out.append((DEFAULT_HEVC_MOVIE, "Dark Knight (HEVC movie)", "hevc-movie"))
# AV1 episode (auto)
av1_id, av1_label = find_av1_episode(token, user_id)
if av1_id:
out.append((av1_id, f"{av1_label} (AV1 ep)", "av1-episode"))
# H.264 episode (auto, different series from AV1)
series_id_excl = None
if av1_id:
try:
ep = _req(f"/Users/{user_id}/Items/{av1_id}", token=token)
series_id_excl = ep.get("SeriesId")
except Exception:
pass
h264_id, h264_label = find_h264_episode(token, user_id, series_id_excl)
if h264_id:
out.append((h264_id, f"{h264_label} (H.264 ep)", "h264-episode"))
return out
# ---------- Playwright probe ----------
PROBE_SELECTORS = [
".itemBackdrop", ".detailBackdrop", ".backdropContainer",
".backgroundContainer", ".layout-desktop",
"body", "#reactRoot", ".itemDetailPage",
"video", ".htmlvideoplayer", ".btnPlay",
".detailPagePrimaryContainer", ".detailSection", ".detailVerticalSection",
".itemsContainer", ".padded-bottom-page", ".mainAnimatedPages",
".pageContainer", ".cardScalable", ".scrollSlider",
".sectionTitleContainer", ".detailPageContent", ".detailPageWrapperContainer",
".moreFromSeason", ".moreFromSeasonContainer", # admin-only carousel
]
async def probe_dom(page):
return await page.evaluate(
"""(SEL) => {
const result = {};
for (const s of SEL) {
const els = document.querySelectorAll(s);
if (!els.length) { result[s] = '<absent>'; continue; }
const el = els[0];
const cs = getComputedStyle(el);
result[s] = {
count: els.length,
display: cs.display,
opacity: cs.opacity,
visibility: cs.visibility,
background: cs.backgroundColor,
backgroundImage: cs.backgroundImage.slice(0, 80),
zIndex: cs.zIndex,
rect: el.getBoundingClientRect().toJSON(),
};
}
result.__title = document.title;
const playBtn = document.querySelector('.btnPlay, [data-action="play"]');
result.__playBtnText = playBtn
? (playBtn.innerText || playBtn.textContent || '').trim() : null;
result.__bodyClasses = document.body.className;
result.__url = location.href;
// List of all section-title texts so we can diff per-user.
result.__sectionTitles = Array.from(
document.querySelectorAll('.sectionTitleContainer, h2, .sectionHeader')
).map(e => (e.innerText || e.textContent || '').trim()).filter(Boolean);
return result;
}""",
PROBE_SELECTORS,
)
async def sweep_backgrounds(page):
"""Walk visible elements; return ones with non-transparent bg whose rect
overlaps where the pinned backdrop should be visible (top of viewport
above ~70% page height). The criterion is intentionally generous;
callers filter via the allowlist."""
return await page.evaluate(
r"""() => {
const isOpaque = (c) => {
if (!c || c === 'rgba(0, 0, 0, 0)' || c === 'transparent') return false;
const m = c.match(/rgba?\(\s*(\d+)\s*,\s*(\d+)\s*,\s*(\d+)(?:\s*,\s*([\d.]+))?\)/);
if (!m) return true;
const a = m[4] !== undefined ? parseFloat(m[4]) : 1.0;
return a > 0.05;
};
const out = [];
const all = document.querySelectorAll('*');
for (const el of all) {
const cs = getComputedStyle(el);
if (cs.display === 'none' || cs.visibility === 'hidden') continue;
if (!isOpaque(cs.backgroundColor)) continue;
const r = el.getBoundingClientRect();
if (r.width < 50 || r.height < 50) continue;
// Skip if bg is already same as body (chained inheritance, no diff)
if (el === document.body || el === document.documentElement) continue;
// Build a signature so consumers can match against allowlist
const cls = (el.className && typeof el.className === 'string')
? '.' + el.className.trim().split(/\s+/).join('.')
: '';
const sig = el.tagName.toLowerCase() + (el.id ? '#' + el.id : '') + cls;
out.push({
sig: sig.slice(0, 200),
tag: el.tagName.toLowerCase(),
id: el.id || null,
classes: (typeof el.className === 'string') ? el.className : '',
background: cs.backgroundColor,
rect: { x: r.x, y: r.y, w: r.width, h: r.height },
zIndex: cs.zIndex,
});
}
return out;
}"""
)
def filter_regressions(bg_elements, viewport_w, viewport_h):
"""Apply allowlist + overlap heuristics → regression list.
A bg element is flagged iff:
- It is NOT in the allowlist (any allowlist class appears in its sig).
- Its rect overlaps the visible viewport (x within [0, vw], y within
a band where the backdrop should show, i.e. above 80% page height,
because content scrolls past the pinned backdrop).
- The bg color is "very dark" (R+G+B < 90). Most legit overlays
are clearly tinted; near-black is the failure mode we want.
"""
regressions = []
for el in bg_elements:
sig = el["sig"]
if any(allow in sig for allow in BG_ALLOWLIST):
continue
bg = el["background"]
# Parse rgb sum
try:
nums = [int(x) for x in bg.replace("rgba(", "").replace("rgb(", "")
.replace(")", "").split(",")[:3]]
except Exception:
continue
if sum(nums) > 90:
continue
rect = el["rect"]
if rect["x"] + rect["w"] < 0 or rect["x"] > viewport_w:
continue
regressions.append(el)
return regressions
async def click_play_and_observe(page):
"""Find Play, click, wait 10s, return playback state + new errors."""
pre_console_marker = await page.evaluate("() => Date.now()")
state = {"clicked": False, "selector_used": None, "error": None}
# Try the canonical button selectors in priority order
for sel in [".btnPlay", "[data-action=\"play\"]", "button[is=\"emby-button\"][data-action=\"play\"]"]:
try:
btn = await page.query_selector(sel)
if btn:
box = await btn.bounding_box()
if box and box["width"] > 0:
await btn.click(timeout=5000)
state["clicked"] = True
state["selector_used"] = sel
break
except Exception as e:
state["error"] = f"{sel}: {e}"
if not state["clicked"]:
# Fallback: keyboard 'p', which Jellyfin web binds to play
try:
await page.keyboard.press("p")
state["clicked"] = True
state["selector_used"] = "kbd:p"
except Exception as e:
state["error"] = (state.get("error") or "") + f"; kbd:{e}"
if not state["clicked"]:
return state  # click never landed, so there is no playback to observe
await asyncio.sleep(10)
state["video"] = await page.evaluate("""() => {
const v = document.querySelector('video');
if (!v) return { present: false };
const rect = v.getBoundingClientRect();
return {
present: true,
src: (v.src || '').slice(0, 200),
currentTime: v.currentTime,
paused: v.paused,
ended: v.ended,
readyState: v.readyState,
networkState: v.networkState,
error: v.error ? { code: v.error.code, message: v.error.message } : null,
videoWidth: v.videoWidth,
videoHeight: v.videoHeight,
duration: v.duration,
rect: { x: rect.x, y: rect.y, w: rect.width, h: rect.height },
buffered_ranges: v.buffered.length,
};
}""")
state["pre_marker"] = pre_console_marker
return state
async def run_one(p, user, password, role, run_idx, console_messages, network_failures):
"""Execute the full probe sequence for one user. Returns dict for JSON."""
print(f"\n=== Run {run_idx}: {role} ({user}) ===")
auth = login(user, password)
token = auth["AccessToken"]
user_id = auth["User"]["Id"]
server_id = auth["ServerId"]
is_admin = auth["User"].get("Policy", {}).get("IsAdministrator", False)
print(f"[+] Auth OK uid={user_id} admin={is_admin}")
items = resolve_items(token, user_id)
if not items:
print("[!] No items resolvable — aborting run")
return {"role": role, "user": user, "is_admin": is_admin, "items": [],
"error": "no items"}
print(f"[+] Items: {[(i[1], i[2]) for i in items]}")
runs = []
browser = await p.chromium.launch(
headless=True,
args=["--no-sandbox", "--disable-dev-shm-usage",
"--autoplay-policy=no-user-gesture-required"])
ctx = await browser.new_context(
viewport={"width": 1600, "height": 900},
ignore_https_errors=True)
page = await ctx.new_page()
page.on("console", lambda m: console_messages.append(
{"role": role, "user": user, "type": m.type, "text": m.text}))
page.on("requestfailed", lambda r: network_failures.append(
{"role": role, "user": user, "method": r.method, "url": r.url,
"failure": str(r.failure)}))
page.on("response", lambda r: None if r.status < 400 else
network_failures.append({"role": role, "user": user, "status": r.status,
"url": r.url}))
# --- form login (mirrors v1) ---
await page.goto(f"{URL}/web/", wait_until="networkidle", timeout=30000)
await asyncio.sleep(3)
try:
await page.wait_for_selector("input", timeout=20000)
inputs = await page.evaluate(
"() => Array.from(document.querySelectorAll('input')).map(i => "
"({id:i.id, name:i.name, type:i.type, placeholder:i.placeholder}))")
user_sel = pass_sel = None
for i in inputs:
fid, fname, ftype = i.get("id", ""), i.get("name", ""), i.get("type", "")
if not user_sel and (ftype == "text" or "user" in (fid+fname).lower()
or "name" in (fid+fname).lower()):
user_sel = f"#{fid}" if fid else f'input[name="{fname}"]'
if not pass_sel and ftype == "password":
pass_sel = f"#{fid}" if fid else f'input[name="{fname}"]'
if user_sel and pass_sel:
await page.fill(user_sel, user)
await page.fill(pass_sel, password)
await page.keyboard.press("Enter")
await page.wait_for_load_state("networkidle", timeout=20000)
await asyncio.sleep(2)
print(f"[+] form login OK as {user}")
else:
print("[!] login fields not found — continuing with API token")
except Exception as e:
print(f"[!] form login failed: {e}")
for item_id, label, kind in items:
target = f"{URL}/web/#/details?id={item_id}&serverId={server_id}"
print(f"\n[*] {role}/{kind}: {label}{target}")
await page.goto(target, wait_until="networkidle", timeout=30000)
await asyncio.sleep(4)
probe = await probe_dom(page)
viewport = page.viewport_size
vw, vh = viewport["width"], viewport["height"]
# Top + scrolled screenshots
safe_user = user.replace("@", "_").replace("/", "_")
key = f"{safe_user}-{kind}"
top_png = os.path.join(OUT, f"{key}-top.png")
await page.screenshot(path=top_png, full_page=False)
await page.evaluate("() => window.scrollTo(0, document.body.scrollHeight * 0.5)")
await asyncio.sleep(1)
mid_png = os.path.join(OUT, f"{key}-mid.png")
await page.screenshot(path=mid_png, full_page=False)
await page.evaluate("() => window.scrollTo(0, document.body.scrollHeight)")
await asyncio.sleep(1)
bot_png = os.path.join(OUT, f"{key}-bot.png")
await page.screenshot(path=bot_png, full_page=False)
# Background sweep at scroll-bottom (where INC4-style bands manifest)
bg_elements = await sweep_backgrounds(page)
regressions = filter_regressions(bg_elements, vw, vh)
print(f"[*] bg elements: {len(bg_elements)} regressions: {len(regressions)}")
# Click Play and observe
await page.evaluate("() => window.scrollTo(0, 0)")
await asyncio.sleep(1)
play_state = await click_play_and_observe(page)
play_png = os.path.join(OUT, f"{key}-play.png")
await page.screenshot(path=play_png, full_page=False)
# Diff vs golden
diffs = []
for shot in [(top_png, "top"), (mid_png, "mid"), (bot_png, "bot"),
(play_png, "play")]:
golden = os.path.join(OUT, "golden", f"{key}-{shot[1]}.png")
if PIL_OK and os.path.exists(golden):
try:
a = Image.open(shot[0]).convert("RGB")
b = Image.open(golden).convert("RGB")
if a.size != b.size:
diffs.append({"shot": shot[1], "error": "size mismatch"})
continue
diff_img = ImageChops.difference(a, b)
bbox = diff_img.getbbox()
diff_path = os.path.join(OUT, f"{key}-{shot[1]}-diff.png")
diff_img.save(diff_path)
# Numeric mismatch ratio
hist = diff_img.histogram()  # RGB diff: 768 bins, 256 per band
# bin 0 of each band means "no difference"; count everything else across all bands
nonzero = sum(v for i, v in enumerate(hist) if i % 256 != 0)
total = a.size[0] * a.size[1] * 3
ratio = nonzero / total if total else 0
diffs.append({"shot": shot[1], "bbox": bbox, "ratio": ratio,
"diff_path": diff_path})
except Exception as e:
diffs.append({"shot": shot[1], "error": str(e)})
runs.append({
"item_id": item_id, "label": label, "kind": kind,
"screenshots": {"top": top_png, "mid": mid_png, "bot": bot_png,
"play": play_png},
"probe": probe,
"play": play_state,
"bg_count": len(bg_elements),
"regressions": regressions,
"diffs_vs_golden": diffs,
})
await browser.close()
return {"role": role, "user": user, "is_admin": is_admin,
"items": runs}
def section_title_diff(admin_run, guest_run):
"""Return sections present for admin but not guest (admin-only carousels)."""
diffs = []
a_items = {i["kind"]: i for i in admin_run.get("items", [])}
g_items = {i["kind"]: i for i in guest_run.get("items", [])}
for kind in a_items:
if kind not in g_items:
continue
a_titles = set(a_items[kind].get("probe", {}).get("__sectionTitles", []))
g_titles = set(g_items[kind].get("probe", {}).get("__sectionTitles", []))
only_admin = sorted(a_titles - g_titles)
only_guest = sorted(g_titles - a_titles)
if only_admin or only_guest:
diffs.append({"kind": kind, "only_admin": only_admin,
"only_guest": only_guest})
return diffs
def grade(result):
"""Decide pass/fail. Returns (exit_code, summary)."""
issues = []
for run in result["runs"]:
for item in run.get("items", []):
v = item.get("play", {}).get("video", {})
if not v.get("present"):
issues.append(f"{run['user']}/{item['kind']}: <video> absent")
elif v.get("error"):
issues.append(f"{run['user']}/{item['kind']}: video error "
f"code={v['error'].get('code')}")
elif v.get("readyState", 0) < 2:
issues.append(f"{run['user']}/{item['kind']}: video readyState="
f"{v.get('readyState')} (no current data)")
elif v.get("paused") and v.get("currentTime", 0) == 0:
issues.append(f"{run['user']}/{item['kind']}: video paused at t=0")
if item.get("regressions"):
issues.append(f"{run['user']}/{item['kind']}: "
f"{len(item['regressions'])} bg regression(s)")
return (2 if issues else 0, issues)
async def main():
console_messages = []
network_failures = []
print(f"[+] Target: {URL}")
print(f"[+] OUT: {OUT}")
print(f"[+] Admin: {ADMIN_USER}")
print(f"[+] Guest: {GUEST_USER}")
async with async_playwright() as p:
admin_run = await run_one(p, ADMIN_USER, ADMIN_PASS, "admin", 1,
console_messages, network_failures)
guest_run = await run_one(p, GUEST_USER, GUEST_PASS, "guest", 2,
console_messages, network_failures)
section_diff = section_title_diff(admin_run, guest_run)
result = {
"url": URL,
"timestamp": int(time.time()),
"runs": [admin_run, guest_run],
"section_title_diff": section_diff,
"console": console_messages[-200:],
"network_failures": network_failures[-200:],
}
code, issues = grade(result)
result["issues"] = issues
result["exit_code"] = code
with open(os.path.join(OUT, "probe.json"), "w") as f:
json.dump(result, f, indent=2, default=str)
print(f"\n=== SUMMARY ===")
print(f"console: {len(console_messages)} network failures: {len(network_failures)}")
print(f"section diffs: {len(section_diff)}")
if section_diff:
for d in section_diff:
if d["only_admin"]:
print(f" admin-only ({d['kind']}): {d['only_admin']}")
if issues:
print(f"ISSUES ({len(issues)}):")
for i in issues:
print(f" - {i}")
else:
print("no issues detected")
print(f"probe.json: {os.path.join(OUT, 'probe.json')}")
sys.exit(code)
if __name__ == "__main__":
try:
asyncio.run(main())
except urllib.error.HTTPError as e:
print(f"[!] HTTP error during login: {e}")
sys.exit(1)
except Exception as e:
print(f"[!] fatal: {e}")
raise

View file

@ -1,186 +0,0 @@
#!/usr/bin/env python3
"""ARRFLIX headless smoke-test. Logs in via API, navigates to a detail page,
captures screenshot + console errors + network failures + computed-style for
backdrop. Pass dev or prod URL as argv[1]."""
import sys, json, time, os, asyncio, ssl, urllib.request, urllib.error
from playwright.async_api import async_playwright
URL = sys.argv[1] if len(sys.argv) > 1 else "https://dev.arrflix.s8n.ru"
USER = sys.argv[2] if len(sys.argv) > 2 else "guest-mirror"
PASS = sys.argv[3] if len(sys.argv) > 3 else "dev-test-guest"
ITEM = sys.argv[4] if len(sys.argv) > 4 else None # auto-pick first Series if absent
OUT = sys.argv[5] if len(sys.argv) > 5 else "/tmp/arrflix-headless"
os.makedirs(OUT, exist_ok=True)
DEVICE = "headless-test"
DEVICE_ID = "headless-test-2026-05-09"
CLIENT = "Headless"
VERSION = "1.0"
def auth_header(token=None):
h = (f'MediaBrowser Client="{CLIENT}", Device="{DEVICE}", '
f'DeviceId="{DEVICE_ID}", Version="{VERSION}"')
if token:
h += f', Token="{token}"'
return h
def api_post(path, body, token=None):
req = urllib.request.Request(
f"{URL}{path}",
data=json.dumps(body).encode(),
headers={
"Authorization": auth_header(token),
"Content-Type": "application/json",
},
method="POST",
)
ctx = __import__("ssl")._create_unverified_context()
with urllib.request.urlopen(req, context=ctx) as r:
return json.loads(r.read())
def api_get(path, token=None):
req = urllib.request.Request(
f"{URL}{path}",
headers={"Authorization": auth_header(token)},
)
ctx = __import__("ssl")._create_unverified_context()
with urllib.request.urlopen(req, context=ctx) as r:
return json.loads(r.read())
def login():
r = api_post("/Users/AuthenticateByName",
{"Username": USER, "Pw": PASS})
return r["AccessToken"], r["User"]["Id"], r["ServerId"]
async def main():
token, user_id, server_id = login()
print(f"[+] Authenticated as {USER} ({user_id})")
item_id = ITEM
if not item_id:
items = api_get(
f"/Users/{user_id}/Items?Recursive=true&IncludeItemTypes=Series&Limit=5",
token)
if items.get("Items"):
item_id = items["Items"][0]["Id"]
print(f"[+] Auto-picked Series: {items['Items'][0]['Name']} ({item_id})")
else:
print("[!] No series found, falling back to root")
console_messages = []
network_failures = []
async with async_playwright() as p:
browser = await p.chromium.launch(headless=True,
args=["--no-sandbox", "--disable-dev-shm-usage"])
ctx = await browser.new_context(
viewport={"width": 1600, "height": 900},
ignore_https_errors=True)
page = await ctx.new_page()
page.on("console", lambda m: console_messages.append(
f"[{m.type}] {m.text}"))
page.on("requestfailed", lambda r: network_failures.append(
f"{r.method} {r.url} :: {r.failure}"))
page.on("response", lambda r: None if r.status < 400 else
network_failures.append(f"HTTP {r.status} {r.url}"))
# Auth via login form
await page.goto(f"{URL}/web/", wait_until="networkidle", timeout=30000)
await asyncio.sleep(3)
# Wait for any input rendered by SPA
try:
await page.wait_for_selector("input", timeout=20000)
inputs = await page.evaluate(
"() => Array.from(document.querySelectorAll('input')).map(i => ({id:i.id, name:i.name, type:i.type, placeholder:i.placeholder}))")
print(f"[*] inputs: {inputs}")
# Find username input by heuristic
user_sel = None
pass_sel = None
for i in inputs:
fid, fname, ftype = i.get('id',''), i.get('name',''), i.get('type','')
if not user_sel and (ftype == 'text' or 'user' in (fid+fname).lower() or 'name' in (fid+fname).lower()):
user_sel = f'#{fid}' if fid else f'input[name="{fname}"]'
if not pass_sel and ftype == 'password':
pass_sel = f'#{fid}' if fid else f'input[name="{fname}"]'
print(f"[*] user_sel={user_sel} pass_sel={pass_sel}")
if user_sel and pass_sel:
await page.fill(user_sel, USER)
await page.fill(pass_sel, PASS)
await page.keyboard.press("Enter")
await page.wait_for_load_state("networkidle", timeout=20000)
await asyncio.sleep(2)
print("[+] logged in via form")
else:
print("[!] could not locate login fields")
except Exception as e:
print(f"[!] form login failed: {e}")
# Navigate to detail page
target = (f"{URL}/web/#/details?id={item_id}&serverId={server_id}"
if item_id else f"{URL}/web/")
print(f"[*] navigating: {target}")
await page.goto(target, wait_until="networkidle", timeout=30000)
await asyncio.sleep(4) # let SPA paint backdrop
# Probe key DOM elements (extended)
probe = await page.evaluate("""() => {
const result = {};
const sel = ['.itemBackdrop', '.detailBackdrop', '.backdropContainer',
'.backgroundContainer', '.layout-desktop',
'body', '#reactRoot', '.itemDetailPage',
'video', '.htmlvideoplayer', '.btnPlay', '.detailPagePrimaryContainer',
'.detailSection', '.detailVerticalSection', '.itemsContainer',
'.padded-bottom-page', '.mainAnimatedPages', '.pageContainer',
'.cardScalable', '.scrollSlider', '.sectionTitleContainer',
'.detailPageContent', '.detailPageWrapperContainer'];
for (const s of sel) {
const els = document.querySelectorAll(s);
if (els.length === 0) { result[s] = '<absent>'; continue; }
const el = els[0];
const cs = getComputedStyle(el);
result[s] = {
count: els.length,
display: cs.display,
opacity: cs.opacity,
visibility: cs.visibility,
background: cs.backgroundColor,
backgroundImage: cs.backgroundImage.slice(0, 80),
zIndex: cs.zIndex,
rect: el.getBoundingClientRect().toJSON(),
};
}
result['__title'] = document.title;
const playBtn = document.querySelector('.btnPlay, [data-action="play"]');
result['__playBtnText'] = playBtn ? (playBtn.innerText || playBtn.textContent || '').trim() : null;
result['__bodyClasses'] = document.body.className;
result['__url'] = location.href;
return result;
}""")
# Two screenshots: top viewport + scrolled to mid-page (so fixed backdrop renders correctly)
screenshot = os.path.join(OUT, f"{URL.replace('https://','').replace('.','_')}-detail.png")
await page.screenshot(path=screenshot, full_page=False)
# Scroll halfway down to verify pinned backdrop persists
await page.evaluate("() => window.scrollTo(0, document.body.scrollHeight * 0.5)")
await asyncio.sleep(1)
scrolled = os.path.join(OUT, f"{URL.replace('https://','').replace('.','_')}-scrolled.png")
await page.screenshot(path=scrolled, full_page=False)
print(f"[+] screenshot: {screenshot}")
with open(os.path.join(OUT, "probe.json"), "w") as f:
json.dump({
"url": URL,
"user": USER,
"item": item_id,
"probe": probe,
"console": console_messages[-50:],
"network_failures": network_failures[-50:],
}, f, indent=2)
print(f"[+] probe.json: {os.path.join(OUT, 'probe.json')}")
print(f"[+] console msgs: {len(console_messages)}")
print(f"[+] network failures: {len(network_failures)}")
await browser.close()
asyncio.run(main())

View file

@ -1,157 +0,0 @@
#!/usr/bin/env python3
"""Inject the ARRFLIX middle-theme v6 (logo center, Movies/Series left, search right)
into a Jellyfin web overlay's index.html. Idempotent — run repeatedly without drift.
Markers:
/* ARRFLIX-MIDDLE-THEME-BEGIN */ ... /* ARRFLIX-MIDDLE-THEME-END */ inside <style> and <script>
<!--ARRFLIX-FAVICON-BEGIN--> ... <!--ARRFLIX-FAVICON-END--> between <link> tags
Usage:
python3 bin/inject-middle-theme.py [target.html]
ARRFLIX_OVERLAY_PATH=/opt/docker/jellyfin/web-overrides/index.html python3 bin/inject-middle-theme.py
Default target: <repo_root>/web-overrides/index.html
Assets read from <repo_root>/web-overrides/assets/:
- arrflix-A.b64 favicon base64 (no data: prefix)
- arrflix-wordmark.b64-url center-logo data-URL (with data: prefix)
Doc 29 covers the design, the auth gate, and the video-page hide rule.
"""
import os, re, sys, pathlib, time
ROOT = pathlib.Path(__file__).resolve().parent.parent
DEFAULT_TARGET = ROOT / "web-overrides" / "index.html"
ASSETS = ROOT / "web-overrides" / "assets"
target = pathlib.Path(sys.argv[1]) if len(sys.argv) > 1 else pathlib.Path(os.environ.get("ARRFLIX_OVERLAY_PATH", DEFAULT_TARGET))
if not target.exists():
sys.exit(f"target overlay not found: {target}")
logo_a_b64 = (ASSETS / "arrflix-A.b64").read_text(encoding="utf-8").strip()
wordmark_url = (ASSETS / "arrflix-wordmark.b64-url").read_text(encoding="utf-8").strip()
START = "/* ARRFLIX-MIDDLE-THEME-BEGIN */"
END = "/* ARRFLIX-MIDDLE-THEME-END */"
CSS = (
"body.arrflix-themed .skinHeader .headerTop{display:flex!important;align-items:center;position:relative;min-height:48px}\n"
"body.arrflix-themed .skinHeader .headerLeft,body.arrflix-themed .skinHeader .headerRight{flex:1 1 0;display:flex;align-items:center}\n"
"body.arrflix-themed .skinHeader .headerLeft{justify-content:flex-start;gap:.4em}\n"
"body.arrflix-themed .skinHeader .headerRight{justify-content:flex-end}\n"
"body.arrflix-themed .skinHeader .headerHomeButton,body.arrflix-themed .skinHeader .pageTitleWithLogo{display:none!important}\n"
"body.arrflix-themed .skinHeader .headerLeft > h3.pageTitle:not(.pageTitleWithLogo){display:none!important}\n"
"body.arrflix-themed .skinHeader .headerCastButton,body.arrflix-themed .skinHeader .headerSyncButton{display:none!important}\n"
"body.arrflix-themed .headerTabs.sectionTabs{display:none!important}\n"
"/* Hide entire header during video playback */\n"
"body.arrflix-video-active:not(:has(#loginPage:not(.hide))) .skinHeader,body.arrflix-video-active .arrflix-headerLogo,body.arrflix-video-active .arrflix-nav{display:none!important}\n"
".arrflix-headerLogo{position:absolute;left:50%;top:50%;transform:translate(-50%,-50%);width:120px;height:38px;"
"background:center/contain no-repeat url('" + wordmark_url + "');"
"z-index:1;display:block;text-indent:-9999px;overflow:hidden}\n"
".arrflix-headerLogo:hover{filter:brightness(1.15)}\n"
".arrflix-nav{text-transform:uppercase;letter-spacing:.08em;font-weight:600;padding:0 .9em;color:#fff!important;text-decoration:none;display:inline-flex;align-items:center;height:100%;font-size:.85em}\n"
".arrflix-nav:hover{color:#E50914!important}\n"
)
JS = """
(function(){
function isVideoPage(){
try{
var h=(location.hash||'').toLowerCase();
if (h.indexOf('/video') !== -1) return true;
var osd = document.querySelector('#videoOsdPage:not(.hide)');
if (osd) return true;
var v = document.querySelector('.htmlVideoPlayer:not(.hide), video.htmlvideoplayer:not(.hide)');
if (v && getComputedStyle(v).display !== 'none') return true;
}catch(e){}
return false;
}
function isAuthed(){
try{
if (document.querySelector('.pageContainer.loginPage:not(.hide)')) return false;
if (document.querySelector('#loginPage:not(.hide)')) return false;
var h = (location.hash || '').toLowerCase();
if (h.indexOf('/login') !== -1 || h.indexOf('/wizard') !== -1 || h.indexOf('/forgotpassword') !== -1 || h.indexOf('/selectserver') !== -1) return false;
if (window.ApiClient && typeof window.ApiClient.isLoggedIn === 'function' && !window.ApiClient.isLoggedIn()) return false;
var raw = localStorage.getItem('jellyfin_credentials');
if (!raw) return false;
var creds = JSON.parse(raw);
if (!creds || !creds.Servers || !creds.Servers.length || !creds.Servers[0].AccessToken) return false;
return true;
} catch(e){ return false; }
}
function teardown(){
document.body.classList.remove('arrflix-themed');
var top = document.querySelector('.skinHeader .headerTop'); if (!top) return;
var logo = top.querySelector('.arrflix-headerLogo'); if (logo) logo.remove();
Array.prototype.forEach.call(document.querySelectorAll('.arrflix-nav'), function(n){ n.remove(); });
}
function relayoutHeader(){
document.body.classList.toggle('arrflix-video-active', isVideoPage());
if (!isAuthed()) { teardown(); return; }
var top=document.querySelector('.skinHeader .headerTop'); if(!top) return;
document.body.classList.add('arrflix-themed');
var left=top.querySelector('.headerLeft');
if(left && !left.querySelector('[data-arrflix-nav=\"movies\"]')){
left.insertAdjacentHTML('beforeend',
'<a is=\"emby-linkbutton\" class=\"emby-button arrflix-nav\" data-arrflix-nav=\"movies\" href=\"#/movies.html\">Movies</a>'+
'<a is=\"emby-linkbutton\" class=\"emby-button arrflix-nav\" data-arrflix-nav=\"series\" href=\"#/tv.html\">Series</a>'
);
}
if(!top.querySelector('.arrflix-headerLogo')){
var a=document.createElement('a');
a.className='arrflix-headerLogo';
a.href='#/home.html';
a.setAttribute('aria-label','ARRFLIX home');
a.textContent='ARRFLIX';
var right=top.querySelector('.headerRight');
top.insertBefore(a, right || null);
}
}
function start(){
relayoutHeader();
try{ new MutationObserver(relayoutHeader).observe(document.body,{childList:true,subtree:true}); }catch(e){}
window.addEventListener('hashchange', relayoutHeader);
setInterval(relayoutHeader,1500);
}
if(document.readyState==='loading') document.addEventListener('DOMContentLoaded',start,{once:true}); else start();
})();
"""
FAVICON_LINKS = (
"<!--ARRFLIX-FAVICON-BEGIN-->"
"<link rel=\"icon\" type=\"image/png\" sizes=\"180x180\" data-arrflix-icon=\"A\" href=\"data:image/png;base64," + logo_a_b64 + "\">"
"<link rel=\"apple-touch-icon\" sizes=\"180x180\" data-arrflix-icon=\"A\" href=\"data:image/png;base64," + logo_a_b64 + "\">"
"<!--ARRFLIX-FAVICON-END-->"
)
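# Runtime favicon pinning: the SPA re-injects its own icon <link>s after boot,
# so the shim strips large non-ARRFLIX data-URL icons, re-points the tagged
# links at A_URL, and re-runs on <head> mutations plus a 1s interval.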
FAVICON_HIJACK_JS = (
"<script>/* ARRFLIX-FAVICON-HIJACK-BEGIN */"
"(function(){"
"var A_URL='data:image/png;base64," + logo_a_b64 + "';"
"function pin(){"
"Array.prototype.forEach.call(document.querySelectorAll('link[rel=\"shortcut icon\"], link[rel=\"icon\"], link[rel=\"apple-touch-icon\"]'),function(l){"
"if(l.getAttribute('data-arrflix-icon')==='A')return;"
"if((l.href||'').indexOf('data:image/png')!==-1 && l.href.length>200 && l.getAttribute('data-arrflix-icon')!=='A'){l.parentNode&&l.parentNode.removeChild(l);}"
"});"
"Array.prototype.forEach.call(document.querySelectorAll('link[data-arrflix-icon=\"A\"]'),function(l){if(l.href!==A_URL) l.href=A_URL;});"
"}"
"function start(){pin();try{new MutationObserver(pin).observe(document.head||document.documentElement,{childList:true,subtree:true,attributes:true,attributeFilter:['href']});}catch(e){}setInterval(pin,1000);}"
"if(document.readyState==='loading')document.addEventListener('DOMContentLoaded',start,{once:true});else start();"
"})();"
"/* ARRFLIX-FAVICON-HIJACK-END */</script>"
)
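# Idempotency: strip any previously injected spans by their begin/end markers,
# then insert a fresh patch just before </head>. Re-running converges to the
# same output instead of stacking duplicates.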
src = target.read_text(encoding="utf-8")
src = re.sub(re.escape("<style>" + START) + r".*?" + re.escape(END + "</style>"), "", src, flags=re.DOTALL)
src = re.sub(re.escape("<script>" + START) + r".*?" + re.escape(END + "</script>"), "", src, flags=re.DOTALL)
src = re.sub(r"<!--ARRFLIX-FAVICON-BEGIN-->.*?<!--ARRFLIX-FAVICON-END-->", "", src, flags=re.DOTALL)
src = re.sub(r"<script>/\* ARRFLIX-FAVICON-HIJACK-BEGIN \*/.*?/\* ARRFLIX-FAVICON-HIJACK-END \*/</script>", "", src, flags=re.DOTALL)
PATCH = "<style>" + START + CSS + END + "</style>" + "<script>" + START + JS + END + "</script>" + FAVICON_LINKS + FAVICON_HIJACK_JS
if "</head>" not in src:
sys.exit("no </head> in target")
src2 = src.replace("</head>", PATCH + "</head>", 1)
backup = target.with_suffix(target.suffix + f".bak.pre-middle-v6.{int(time.time())}")
backup.write_text(target.read_text(encoding="utf-8"), encoding="utf-8")
target.write_text(src2, encoding="utf-8")
print(f"OK v6 wrote {len(src2)} bytes to {target}; backup at {backup}")

View file

@ -14,104 +14,6 @@ SHIM = MARKER_BEGIN + r"""
(function(){
var TITLE = 'ARRFLIX';
var BARE_RE = /^Jellyfin$/i;
/* === English-lockdown (synchronous, runs before Jellyfin bundle) ===
Pins UI locale to en-US so the SPA never reads navigator.language
or the user's stored preference. Belt-and-braces against:
- localStorage keys the SPA reads on boot
- navigator.language / navigator.languages getters
- fetch / XHR Accept-Language header (best-effort; most browsers
block JS from setting it, but Jellyfin sometimes does)
- user-config save round-trip (rewrite UICulture en-US before send) */
try {
var LS_KEYS = ['appLanguage','selectedlanguage','selectedlocale','language','locale','culture'];
for (var i=0;i<LS_KEYS.length;i++){
try { localStorage.setItem(LS_KEYS[i], 'en-US'); } catch(e){}
}
} catch(e){}
try {
var EN = ['en-US','en'];
Object.defineProperty(Navigator.prototype, 'language', { get:function(){return 'en-US';}, configurable:true });
Object.defineProperty(Navigator.prototype, 'languages', { get:function(){return EN.slice();}, configurable:true });
} catch(e){
/* fallback for engines that won't let us redefine on the prototype */
try { Object.defineProperty(navigator, 'language', { get:function(){return 'en-US';}, configurable:true }); } catch(e2){}
try { Object.defineProperty(navigator, 'languages', { get:function(){return ['en-US','en'];}, configurable:true }); } catch(e2){}
}
/* fetch wrapper: strip Accept-Language on outbound requests, and rewrite
any user-config save body so UICulture is pinned to en-US. */
try {
if (window.fetch) {
var _origFetch = window.fetch;
window.fetch = function(input, init){
try {
init = init || {};
/* strip Accept-Language if present on a plain object headers init */
if (init.headers) {
if (init.headers instanceof Headers) {
try { init.headers.delete('Accept-Language'); } catch(e){}
} else if (typeof init.headers === 'object') {
for (var k in init.headers){ if (k && k.toLowerCase() === 'accept-language') { try { delete init.headers[k]; } catch(e){} } }
}
}
/* rewrite user-config save: POST /Users/{id}/Configuration */
var url = (typeof input === 'string') ? input : (input && input.url) || '';
var method = (init.method || (input && input.method) || 'GET').toUpperCase();
if (url && /\/Users\/[^/]+\/Configuration(\?|$)/.test(url) && method === 'POST' && init.body) {
try {
var body = init.body;
if (typeof body === 'string') {
var obj = JSON.parse(body);
if (obj && typeof obj === 'object') {
obj.UICulture = 'en-US';
init.body = JSON.stringify(obj);
}
}
} catch(e){}
}
} catch(e){}
return _origFetch.call(this, input, init);
};
}
} catch(e){}
/* XHR wrapper: strip Accept-Language; rewrite user-config save body. */
try {
if (window.XMLHttpRequest) {
var _open = XMLHttpRequest.prototype.open;
var _setHeader = XMLHttpRequest.prototype.setRequestHeader;
var _send = XMLHttpRequest.prototype.send;
XMLHttpRequest.prototype.open = function(method, url){
this.__arrflix_method = (method || 'GET').toUpperCase();
this.__arrflix_url = url || '';
return _open.apply(this, arguments);
};
XMLHttpRequest.prototype.setRequestHeader = function(name, value){
if (name && String(name).toLowerCase() === 'accept-language') return;
return _setHeader.apply(this, arguments);
};
XMLHttpRequest.prototype.send = function(body){
try {
if (this.__arrflix_url && /\/Users\/[^/]+\/Configuration(\?|$)/.test(this.__arrflix_url) && this.__arrflix_method === 'POST' && typeof body === 'string') {
try {
var obj = JSON.parse(body);
if (obj && typeof obj === 'object') {
obj.UICulture = 'en-US';
body = JSON.stringify(obj);
}
} catch(e){}
}
} catch(e){}
return _send.call(this, body);
};
}
} catch(e){}
/* Re-pin localStorage on every visibility change (SPA may rewrite on user save) */
function pinLocale(){
try {
var L = ['appLanguage','selectedlanguage','selectedlocale','language','locale','culture'];
for (var i=0;i<L.length;i++){ try { if (localStorage.getItem(L[i]) !== 'en-US') localStorage.setItem(L[i], 'en-US'); } catch(e){} }
} catch(e){}
}
/* === end english-lockdown synchronous block === */
function getFavicon(){
var l = document.querySelector('link[rel="shortcut icon"], link[rel="icon"]');
return l && l.href ? l.href : null;
@ -147,7 +49,7 @@ SHIM = MARKER_BEGIN + r"""
} catch(e){}
}
function start(){
lockTitle(); lockFavicon(); nukeSettings(); pinLocale();
lockTitle(); lockFavicon(); nukeSettings();
try {
var head = document.head || document.querySelector('head');
if (head && window.MutationObserver) {
@ -165,7 +67,6 @@ SHIM = MARKER_BEGIN + r"""
var fav = getFavicon();
if (fav && fav.indexOf('data:image') !== 0) lockFavicon();
nukeSettings();
pinLocale();
}, 1000);
}
if (document.readyState === 'loading') {

View file

@ -1,665 +0,0 @@
#!/usr/bin/env python3
"""ARRFLIX prod-vs-dev playback divergence test (2026-05-09).
Runs the SAME flow against arrflix.s8n.ru (prod) and dev.arrflix.s8n.ru (dev)
for the same physical file (Mike Nolan Show S01E04 Ding Dong Delli.mkv,
H.264+AAC) and produces a side-by-side diff:
- URL of master.m3u8 / Videos/{id}/stream
- PlaybackInfo response MediaSources[0] (DirectPlay/DirectStream/Transcode)
- Final <video> element state at t=5/10/20/30s after Play
- Server ffmpeg cmdline (if transcoding) from docker logs
- HTTP status of all /Videos /Items /master.m3u8 /PlaybackInfo /Audio
/stream requests
Artifacts: /tmp/arrflix-prod-vs-dev/{prod,dev}/{...} + diff.json + diff.md.
Run:
bin/prod-vs-dev-compare.py
"""
import sys, os, json, time, asyncio, ssl, urllib.request, urllib.error, urllib.parse, subprocess, re
from pathlib import Path
from playwright.async_api import async_playwright
OUT = "/tmp/arrflix-prod-vs-dev"
os.makedirs(OUT, exist_ok=True)
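# The two deployments under test. "container" names the docker container whose
# recent logs are tailed over SSH to capture the server-side ffmpeg cmdline.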
SIDES = [
{"side": "prod", "url": "https://arrflix.s8n.ru", "user": "s8n", "pw": "2001dude",
"container": "jellyfin"},
{"side": "dev", "url": "https://dev.arrflix.s8n.ru", "user": "test", "pw": "2001dude",
"container": "jellyfin-dev"},
]
ITEM_ID = "9312799ca24979bd05aad9733ce7ee14" # MNS S01E04 (same on both sides)
ITEM_LABEL = "Mike Nolan Show — S01E04 (Ding Dong Delli)"
DEVICE_ID = "prodvsdev-2026-05-09"
CLIENT = "ProdVsDev"
APIKEY_NAME = "arrflix-prodvsdev-2026-05-09"
CTX = ssl._create_unverified_context()
# ------------------- HTTP helpers -------------------
def auth_h(token=None):
h = (f'MediaBrowser Client="{CLIENT}", Device="cli", DeviceId="{DEVICE_ID}", '
f'Version="1.0"')
if token:
h += f', Token="{token}"'
return h
def http(url, path, method="GET", body=None, token=None):
data = json.dumps(body).encode() if body is not None else None
headers = {
"Authorization": auth_h(token),
"Content-Type": "application/json",
}
req = urllib.request.Request(
f"{url}{path}", data=data, headers=headers, method=method)
raw = urllib.request.urlopen(req, context=CTX, timeout=20).read()
return json.loads(raw) if raw else {}
def login(url, user, pw):
last_err = None
for attempt in range(3):
try:
return http(url, "/Users/AuthenticateByName", "POST",
{"Username": user, "Pw": pw})
except urllib.error.HTTPError as e:
last_err = e
if e.code in (500, 503):
time.sleep(3); continue
raise
raise last_err
def playbackinfo(url, item_id, user_id, token):
"""Mimic the web-client's /PlaybackInfo POST body for a generic browser."""
body = {
"DeviceProfile": {
"MaxStreamingBitrate": 140000000,
"MaxStaticBitrate": 100000000,
"MusicStreamingTranscodingBitrate": 384000,
"DirectPlayProfiles": [
{"Container": "mp4,m4v", "Type": "Video",
"VideoCodec": "h264,hevc,vp9,av1",
"AudioCodec": "aac,mp3,ac3,eac3,opus,flac"},
{"Container": "mkv", "Type": "Video",
"VideoCodec": "h264,hevc,vp9,av1",
"AudioCodec": "aac,mp3,ac3,eac3,opus,flac"},
{"Container": "webm", "Type": "Video",
"VideoCodec": "vp9,av1", "AudioCodec": "opus,vorbis"},
],
"TranscodingProfiles": [
{"Container": "ts", "Type": "Video", "VideoCodec": "h264",
"AudioCodec": "aac", "Protocol": "hls", "Context": "Streaming",
"MaxAudioChannels": "2"},
{"Container": "mp4", "Type": "Video", "VideoCodec": "h264",
"AudioCodec": "aac", "Context": "Static",
"MaxAudioChannels": "2"},
],
"ContainerProfiles": [],
"CodecProfiles": [],
"SubtitleProfiles": [
{"Format": "vtt", "Method": "External"},
{"Format": "srt", "Method": "External"},
],
},
"AutoOpenLiveStream": True,
"IsPlayback": True,
}
return http(url, f"/Items/{item_id}/PlaybackInfo?UserId={user_id}",
"POST", body, token=token)
def make_apikey(url, token, name=APIKEY_NAME):
"""Issue an API key. Jellyfin only takes the name in query string."""
try:
http(url, f"/Auth/Keys?App={name}", "POST", token=token)
except urllib.error.HTTPError:
pass
keys = http(url, "/Auth/Keys", token=token)
for k in keys.get("Items", []):
if k.get("AppName") == name:
return k.get("AccessToken")
return None
def del_apikey(url, token, name=APIKEY_NAME):
try:
keys = http(url, "/Auth/Keys", token=token)
for k in keys.get("Items", []):
if k.get("AppName") == name:
http(url, f"/Auth/Keys/{k['AccessToken']}", "DELETE", token=token)
except Exception as e:
print(f"[!] del_apikey({name}): {e}")
# ------------------- Playwright run -------------------
async def run_side(p, side_cfg):
side = side_cfg["side"]; url = side_cfg["url"]
user = side_cfg["user"]; pw = side_cfg["pw"]
side_dir = os.path.join(OUT, side)
os.makedirs(side_dir, exist_ok=True)
# API login
auth = login(url, user, pw)
token = auth["AccessToken"]; uid = auth["User"]["Id"]
server_id = auth["ServerId"]
is_admin = auth["User"].get("Policy", {}).get("IsAdministrator", False)
print(f"\n=== {side} === user={user} uid={uid} admin={is_admin}")
# API-side PlaybackInfo (independent of browser, for canonical record)
pbi_api = playbackinfo(url, ITEM_ID, uid, token)
with open(os.path.join(side_dir, "playbackinfo-api.json"), "w") as f:
json.dump(pbi_api, f, indent=2)
ms = pbi_api.get("MediaSources", [])
if ms:
m = ms[0]
print(f"[{side}] PlaybackInfo (API): DirectPlay={m.get('SupportsDirectPlay')} "
f"DirectStream={m.get('SupportsDirectStream')} "
f"Transcoding={m.get('SupportsTranscoding')} "
f"transcodeUrl={m.get('TranscodingUrl','-')[:80]}")
# API key for this run (created because the caller asked for one, even if not strictly needed here)
apikey = make_apikey(url, token)
print(f"[{side}] api key: {apikey[:8] if apikey else None}")
# Browser pass
browser = await p.chromium.launch(
headless=True,
args=["--no-sandbox", "--disable-dev-shm-usage",
"--autoplay-policy=no-user-gesture-required",
"--use-fake-ui-for-media-stream"])
ctx = await browser.new_context(
viewport={"width": 1600, "height": 900},
ignore_https_errors=True)
page = await ctx.new_page()
requests, responses, console = [], [], []
pbi_response_bodies = []
def on_request(req):
u = req.url
if any(x in u for x in ["/Videos/", "/Items/", "/master.m3u8",
"/PlaybackInfo", "/Audio/", "/stream"]):
requests.append({"method": req.method, "url": u,
"post": req.post_data[:300] if req.post_data else None})
page.on("request", on_request)
async def on_response(r):
u = r.url
if any(x in u for x in ["/Videos/", "/Items/", "/master.m3u8",
"/PlaybackInfo", "/Audio/", "/stream"]):
entry = {"method": r.request.method, "url": u, "status": r.status}
responses.append(entry)
if "/PlaybackInfo" in u and r.request.method == "POST":
try:
body = await r.json()
pbi_response_bodies.append({"url": u, "body": body})
except Exception:
pass
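# The response hook must await r.json(), so it is scheduled as a task from a
# synchronous lambda instead of being registered directly.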
page.on("response", lambda r: asyncio.create_task(on_response(r)))
page.on("console", lambda m: console.append({"type": m.type,
"text": m.text[:300]}))
# Form login (handles both manual-form and user-avatar landing pages)
await page.goto(f"{url}/web/", wait_until="networkidle", timeout=30000)
await asyncio.sleep(3)
# If we landed on the avatar/user-list selection screen, click "Manual Login"
try:
manual = await page.query_selector(".manualLoginForm a, .btnManual, a.button-link")
if manual:
txt = (await manual.inner_text()).strip().lower()
if "manual" in txt:
await manual.click()
await asyncio.sleep(2)
# Or there might be a direct "Manual Login" button on the avatar grid
manual_btn = await page.query_selector("text=/Manual Login/i")
if manual_btn:
try:
await manual_btn.click(timeout=2000); await asyncio.sleep(1)
except Exception:
pass
except Exception as e:
print(f"[{side}] manual-login click attempt: {e}")
try:
await page.wait_for_selector("input[type=password]", timeout=15000)
# Use the canonical Jellyfin login fields
u_sel = "#txtManualName"
pw_sel = "#txtManualPassword"
# Fall back to dynamic discovery if the canonical IDs are absent
if not await page.query_selector(u_sel):
inputs = await page.evaluate(
"() => Array.from(document.querySelectorAll('input')).map(i => "
"({id:i.id, name:i.name, type:i.type}))")
u_sel = pw_sel = None
for i in inputs:
fid, fname, ftype = i.get("id", ""), i.get("name", ""), i.get("type", "")
if not u_sel and (ftype == "text" or "user" in (fid+fname).lower()
or "name" in (fid+fname).lower()):
u_sel = f"#{fid}" if fid else f'input[name="{fname}"]'
if not pw_sel and ftype == "password":
pw_sel = f"#{fid}" if fid else f'input[name="{fname}"]'
await page.fill(u_sel, user)
await page.fill(pw_sel, pw)
await page.keyboard.press("Enter")
await page.wait_for_load_state("networkidle", timeout=20000)
await asyncio.sleep(3)
print(f"[{side}] form login OK as {user}")
except Exception as e:
print(f"[{side}] form login error: {e}")
# Navigate to detail page
target = f"{url}/web/#/details?id={ITEM_ID}&serverId={server_id}"
print(f"[{side}] goto {target}")
await page.goto(target, wait_until="networkidle", timeout=30000)
await asyncio.sleep(4)
await page.screenshot(path=os.path.join(side_dir, "detail.png"))
# Click Play
play_clicked = False
used_sel = None
for sel in [".btnPlay", "[data-action=\"play\"]"]:
try:
btn = await page.query_selector(sel)
if btn:
box = await btn.bounding_box()
if box and box["width"] > 0:
await btn.click(timeout=5000)
play_clicked = True; used_sel = sel; break
except Exception:
pass
if not play_clicked:
try:
await page.keyboard.press("p"); play_clicked = True; used_sel = "kbd:p"
except Exception:
pass
print(f"[{side}] play clicked={play_clicked} via={used_sel}")
# Sample state at t=5/10/20/30s
timestamps = [5, 10, 20, 30]
samples = []
last = 0
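# Sleep only the delta between sample points so each snapshot lands at an
# absolute t=5/10/20/30s after the Play click.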
for t in timestamps:
await asyncio.sleep(t - last)
last = t
snap = await page.evaluate("""() => {
const v = document.querySelector('video');
if (!v) return { present: false };
// Sample whether the <video> is painting actual pixels by drawing
// a thumbnail to a hidden canvas and checking the average luma.
// If the average is ~0 (or all-near-zero), the video element is
// rendering opaque black despite claiming to play.
let paintLuma = null, paintRGBSum = null, paintOk = null, paintErr = null;
try {
const c = document.createElement('canvas');
c.width = 32; c.height = 18;
const ctx = c.getContext('2d', { willReadFrequently: true });
ctx.drawImage(v, 0, 0, 32, 18);
const d = ctx.getImageData(0, 0, 32, 18).data;
let r=0,g=0,b=0,n=0;
for (let i=0;i<d.length;i+=4){r+=d[i];g+=d[i+1];b+=d[i+2];n++;}
paintLuma = (0.299*r + 0.587*g + 0.114*b) / n;
paintRGBSum = (r+g+b)/n;
paintOk = paintLuma > 4; // > a few luma above pure black
} catch (e) { paintErr = String(e); }
// Stacking diagnosis: who's on top of the video center?
const r = v.getBoundingClientRect();
const cx = r.x + r.width/2, cy = r.y + r.height/2;
let stackAtVideoCenter = [];
try {
const els = (typeof document.elementsFromPoint === 'function')
? document.elementsFromPoint(cx, cy) : [];
stackAtVideoCenter = els.slice(0, 6).map(e => {
const cs = getComputedStyle(e);
return {
tag: e.tagName.toLowerCase(),
id: e.id || null,
cls: (typeof e.className === 'string' ? e.className : '').slice(0, 80),
bg: cs.backgroundColor,
opacity: cs.opacity,
zIndex: cs.zIndex,
position: cs.position,
isVideo: e === v,
};
});
} catch (e) {}
const osd = document.getElementById('videoOsdPage');
const osdInfo = osd ? {
bg: getComputedStyle(osd).backgroundColor,
display: getComputedStyle(osd).display,
opacity: getComputedStyle(osd).opacity,
position: getComputedStyle(osd).position,
zIndex: getComputedStyle(osd).zIndex,
cls: osd.className,
rect: osd.getBoundingClientRect().toJSON ? osd.getBoundingClientRect().toJSON() : null,
} : null;
return {
present: true,
src: v.src || '',
currentSrc: v.currentSrc || '',
currentTime: v.currentTime,
duration: v.duration,
paused: v.paused,
ended: v.ended,
readyState: v.readyState,
networkState: v.networkState,
error: v.error ? { code: v.error.code, message: v.error.message } : null,
videoWidth: v.videoWidth,
videoHeight: v.videoHeight,
bufferedRanges: v.buffered.length,
bufferedEnd: v.buffered.length ? v.buffered.end(v.buffered.length-1) : 0,
paintLuma, paintRGBSum, paintOk, paintErr,
stackAtVideoCenter,
videoOsdPage: osdInfo,
};
}""")
samples.append({"t": t, "video": snap})
await page.screenshot(path=os.path.join(side_dir, f"play-t{t}.png"))
ct = snap.get('currentTime')
ct_s = f"{ct:.2f}" if isinstance(ct, (int, float)) else str(ct)
pl = snap.get('paintLuma')
pl_s = f"{pl:.1f}" if isinstance(pl, (int, float)) else str(pl)
print(f"[{side}] t={t}s: time={ct_s} "
f"paused={snap.get('paused')} err={snap.get('error')} "
f"dim={snap.get('videoWidth')}x{snap.get('videoHeight')} "
f"rs={snap.get('readyState')} paintLuma={pl_s} paintOk={snap.get('paintOk')}")
# Final src URL fully decoded
final_src = samples[-1]["video"].get("currentSrc") or samples[-1]["video"].get("src", "")
final_src_decoded = urllib.parse.unquote(final_src) if final_src else ""
await browser.close()
# Server side ffmpeg / transcode log
server_logs = ""
try:
server_logs = subprocess.check_output(
["ssh", "-o", "ConnectTimeout=5", "user@192.168.0.100",
f"docker logs --since 2m {side_cfg['container']} 2>&1 | tail -300"],
timeout=15).decode(errors="replace")
except Exception as e:
server_logs = f"(failed to fetch server logs: {e})"
# Extract ffmpeg cmdline + transcode reasons from log
ffmpeg_cmd = None
for line in server_logs.splitlines():
if "ffmpeg" in line.lower() and ("-i " in line or "-f hls" in line or "-c:v" in line):
ffmpeg_cmd = line.strip()
break
transcode_reasons = []
for line in server_logs.splitlines():
if "transcode reason" in line.lower() or "TranscodeReasons" in line:
transcode_reasons.append(line.strip())
# Save artifacts
side_out = {
"side": side, "url": url, "user": user, "uid": uid, "is_admin": is_admin,
"server_id": server_id, "item_id": ITEM_ID, "item_label": ITEM_LABEL,
"play_clicked": play_clicked, "play_selector": used_sel,
"samples": samples,
"final_src": final_src,
"final_src_decoded": final_src_decoded,
"playbackinfo_api": pbi_api,
"playbackinfo_browser_responses": pbi_response_bodies,
"requests": requests,
"responses": responses,
"console": console[-200:],
"ffmpeg_cmdline": ffmpeg_cmd,
"transcode_reasons_log": transcode_reasons,
}
with open(os.path.join(side_dir, "result.json"), "w") as f:
json.dump(side_out, f, indent=2, default=str)
with open(os.path.join(side_dir, "server.log"), "w") as f:
f.write(server_logs)
# Cleanup the temp api key
del_apikey(url, token)
return side_out
# ------------------- Diff & report -------------------
def diff_results(prod, dev):
"""Build the comparison matrix."""
def keyfields(pbi):
ms = pbi.get("MediaSources", [])
if not ms:
return None
m = ms[0]
return {
"Container": m.get("Container"),
"Protocol": m.get("Protocol"),
"SupportsDirectPlay": m.get("SupportsDirectPlay"),
"SupportsDirectStream": m.get("SupportsDirectStream"),
"SupportsTranscoding": m.get("SupportsTranscoding"),
"TranscodingUrl": m.get("TranscodingUrl"),
"TranscodingSubProtocol": m.get("TranscodingSubProtocol"),
"TranscodingContainer": m.get("TranscodingContainer"),
"TranscodeReasons": m.get("TranscodeReasons"),
"Bitrate": m.get("Bitrate"),
"Size": m.get("Size"),
"Path": m.get("Path"),
}
p_pbi = keyfields(prod["playbackinfo_api"])
d_pbi = keyfields(dev["playbackinfo_api"])
last_p = prod["samples"][-1]["video"]
last_d = dev["samples"][-1]["video"]
out = {
"item_id": ITEM_ID, "label": ITEM_LABEL,
"prod_url": prod["url"], "dev_url": dev["url"],
"playback_info_diff": {
"prod": p_pbi, "dev": d_pbi,
"differences": {
k: {"prod": p_pbi.get(k), "dev": d_pbi.get(k)}
for k in (set(p_pbi or {}) | set(d_pbi or {}))
if (p_pbi or {}).get(k) != (d_pbi or {}).get(k)
} if p_pbi and d_pbi else "missing-on-one-side",
},
"video_state_t30": {
"prod": last_p,
"dev": last_d,
"differences": {
k: {"prod": last_p.get(k), "dev": last_d.get(k)}
for k in (set(last_p) | set(last_d))
if last_p.get(k) != last_d.get(k)
},
},
"stream_url_prod": prod.get("final_src_decoded"),
"stream_url_dev": dev.get("final_src_decoded"),
"ffmpeg_cmdline_prod": prod.get("ffmpeg_cmdline"),
"ffmpeg_cmdline_dev": dev.get("ffmpeg_cmdline"),
"transcode_reasons_log_prod": prod.get("transcode_reasons_log"),
"transcode_reasons_log_dev": dev.get("transcode_reasons_log"),
"http_status_diff": [],
}
# HTTP-status diff: for matched URL templates, show statuses where they differ.
def normalise(u):
# Collapse 32-hex item IDs to a wildcard and drop the query string and
# scheme/host so equivalent URLs from both sides compare equal.
u = re.sub(r"/Videos/[a-f0-9]{32}", "/Videos/*", u)
u = re.sub(r"/Items/[a-f0-9]{32}", "/Items/*", u)
u = re.sub(r"\?.*$", "", u)
u = re.sub(r"^https?://[^/]+", "", u)
return u
def status_map(rs):
out = {}
for r in rs:
k = (r["method"], normalise(r["url"]))
out.setdefault(k, []).append(r["status"])
return out
sp = status_map(prod.get("responses", []))
sd = status_map(dev.get("responses", []))
keys = set(sp) | set(sd)
for k in sorted(keys):
if sp.get(k) != sd.get(k):
out["http_status_diff"].append({
"method": k[0], "path": k[1],
"prod": sp.get(k), "dev": sd.get(k),
})
return out
def render_md(diff, prod, dev):
pp = diff["playback_info_diff"].get("prod") or {}
dp = diff["playback_info_diff"].get("dev") or {}
last_p = diff["video_state_t30"]["prod"]
last_d = diff["video_state_t30"]["dev"]
def fmt_bool(x): return "Y" if x else ("N" if x is False else "")
def headline():
# Three failure modes to recognise, in order:
# 1. paused-at-zero → MediaSource attach never fired
# 2. <video>.error → format/decode error
# 3. paint-black → video advances but renders no pixels (DRM-style
# black, or codec-not-actually-decodable in this
# chromium build despite advancing the clock)
bp = bool(last_p.get("paused")) and (last_p.get("currentTime", 0) or 0) < 0.1
bd = bool(last_d.get("paused")) and (last_d.get("currentTime", 0) or 0) < 0.1
if bp and not bd:
return ("prod fails because video stayed paused at t=0 while dev advanced")
if bd and not bp:
return ("dev fails because video stayed paused at t=0 while prod advanced")
if last_p.get("error") and not last_d.get("error"):
return f"prod fails because <video>.error code={last_p['error'].get('code')}"
if last_d.get("error") and not last_p.get("error"):
return f"dev fails because <video>.error code={last_d['error'].get('code')}"
# Paint check
pp_ok = last_p.get("paintOk"); dp_ok = last_d.get("paintOk")
if pp_ok is False and dp_ok is True:
return ("prod fails because <video> advances time but paints all-black "
"(paintLuma~0) while dev paints normally — pixels never reach the canvas")
if dp_ok is False and pp_ok is True:
return ("dev fails because <video> advances time but paints all-black "
"(paintLuma~0) while prod paints normally")
# OSD overlay check
def topel(s): return (s.get("stackAtVideoCenter") or [{}])[0]
pt = topel(last_p); dt = topel(last_d)
def is_opaque_black(bg):
if not bg: return False
try:
nums = [int(x) for x in bg.replace("rgba(","").replace("rgb(","").replace(")","").split(",")[:3]]
return sum(nums) < 30
except Exception: return False
if (not pt.get("isVideo")) and is_opaque_black(pt.get("bg")) \
and (dt.get("isVideo") or not is_opaque_black(dt.get("bg"))):
return (f"prod fails because an opaque-black `{pt.get('tag')}#{pt.get('id') or ''}"
f".{pt.get('cls')}` element is rendered on top of the <video> "
f"(bg={pt.get('bg')}); dev's video is uncovered")
return "neither side errored or painted black explicitly — see HTTP/PlaybackInfo/cmdline diffs"
md = []
md.append(f"# Prod vs Dev — playback divergence test ({time.strftime('%Y-%m-%d %H:%M')})")
md.append("")
md.append(f"Item: **{diff['label']}** (ItemId `{diff['item_id']}`)")
md.append("")
md.append(f"**Headline:** {headline()}")
md.append("")
md.append("## Final video state at t=30s")
md.append("| Field | prod | dev |")
md.append("|---|---|---|")
for k in ["present", "currentTime", "duration", "paused", "ended",
"readyState", "networkState", "error",
"videoWidth", "videoHeight", "bufferedRanges", "bufferedEnd",
"paintLuma", "paintRGBSum", "paintOk"]:
md.append(f"| {k} | `{last_p.get(k)}` | `{last_d.get(k)}` |")
# OSD page styling diff (the smoking gun for prod black-screen)
p_osd = last_p.get("videoOsdPage") or {}
d_osd = last_d.get("videoOsdPage") or {}
md.append("")
md.append("## #videoOsdPage style (the OSD container painted on top of the <video>)")
md.append("| Field | prod | dev |")
md.append("|---|---|---|")
for k in ["bg", "opacity", "position", "zIndex", "display", "cls"]:
md.append(f"| {k} | `{p_osd.get(k)}` | `{d_osd.get(k)}` |")
md.append("")
md.append("## Stack at video center (top → bottom)")
md.append("### prod")
for s in (last_p.get("stackAtVideoCenter") or []):
md.append(f"- `{s.get('tag')}#{s.get('id')}.{s.get('cls')}` "
f"bg=`{s.get('bg')}` z=`{s.get('zIndex')}` pos=`{s.get('position')}` "
f"isVideo={s.get('isVideo')}")
md.append("### dev")
for s in (last_d.get("stackAtVideoCenter") or []):
md.append(f"- `{s.get('tag')}#{s.get('id')}.{s.get('cls')}` "
f"bg=`{s.get('bg')}` z=`{s.get('zIndex')}` pos=`{s.get('position')}` "
f"isVideo={s.get('isVideo')}")
md.append("")
md.append("## Stream URL (decoded)")
md.append(f"- **prod**: `{diff.get('stream_url_prod') or '(empty)'}`")
md.append(f"- **dev**: `{diff.get('stream_url_dev') or '(empty)'}`")
md.append("")
md.append("## PlaybackInfo MediaSources[0]")
md.append("| Field | prod | dev |")
md.append("|---|---|---|")
for k in ["Container", "Protocol", "SupportsDirectPlay",
"SupportsDirectStream", "SupportsTranscoding",
"TranscodingUrl", "TranscodingSubProtocol", "TranscodingContainer",
"TranscodeReasons", "Bitrate", "Size", "Path"]:
md.append(f"| {k} | `{pp.get(k)}` | `{dp.get(k)}` |")
md.append("")
md.append("## ffmpeg cmdline (from docker logs)")
md.append(f"- **prod**: `{diff.get('ffmpeg_cmdline_prod') or '(none — no transcoding observed)'}`")
md.append(f"- **dev**: `{diff.get('ffmpeg_cmdline_dev') or '(none — no transcoding observed)'}`")
md.append("")
md.append("## HTTP status differences")
if diff.get("http_status_diff"):
md.append("| Method | Path | prod | dev |")
md.append("|---|---|---|---|")
for r in diff["http_status_diff"]:
md.append(f"| {r['method']} | `{r['path']}` | {r['prod']} | {r['dev']} |")
else:
md.append("(none — all matched URLs returned the same status code)")
md.append("")
md.append("## Per-sample timeline")
md.append("| t | prod time | prod paused | prod err | dev time | dev paused | dev err |")
md.append("|---|---|---|---|---|---|---|")
for ps, ds in zip(prod["samples"], dev["samples"]):
pv, dv = ps["video"], ds["video"]
md.append(f"| {ps['t']}s | {pv.get('currentTime')} | {pv.get('paused')} | "
f"{pv.get('error')} | {dv.get('currentTime')} | {dv.get('paused')} | "
f"{dv.get('error')} |")
md.append("")
return "\n".join(md)
# ------------------- main -------------------
async def main():
print(f"[+] OUT: {OUT}")
async with async_playwright() as p:
prod = await run_side(p, SIDES[0])
dev = await run_side(p, SIDES[1])
diff = diff_results(prod, dev)
with open(os.path.join(OUT, "diff.json"), "w") as f:
json.dump(diff, f, indent=2, default=str)
md = render_md(diff, prod, dev)
with open(os.path.join(OUT, "diff.md"), "w") as f:
f.write(md)
print("\n=== SUMMARY ===")
last_p = diff["video_state_t30"]["prod"]; last_d = diff["video_state_t30"]["dev"]
print(f"prod t=30: time={last_p.get('currentTime')} paused={last_p.get('paused')} "
f"err={last_p.get('error')} dim={last_p.get('videoWidth')}x{last_p.get('videoHeight')}")
print(f"dev t=30: time={last_d.get('currentTime')} paused={last_d.get('paused')} "
f"err={last_d.get('error')} dim={last_d.get('videoWidth')}x{last_d.get('videoHeight')}")
print(f"diff.json: {os.path.join(OUT, 'diff.json')}")
print(f"diff.md: {os.path.join(OUT, 'diff.md')}")
if __name__ == "__main__":
asyncio.run(main())

View file

@ -1,45 +0,0 @@
# Jellyfin DEV — second instance for theme/branding experimentation
# Deploy path on nullstone: /opt/docker/jellyfin-dev/
# Domain: dev.arrflix.s8n.ru (LAN-only via Pi-hole local DNS + no-guest middleware)
#
# Purpose:
# - Isolated playground for trying themes (Cineplex, ElegantFin, NeutralFin, ...)
# without touching the live arrflix.s8n.ru that real users (marco, house, guest, 5)
# are watching.
# - Same media library mounted READ-ONLY so dev sees the same titles but cannot
# mutate the on-disk library.
# - Separate config/cache so first-run wizard, accounts and branding live here only.
# - LAN-only: no-guest middleware on Traefik; do NOT publish to WAN.
#
# Image pinned to 10.10.3 to match prod for theme parity. Bump prod first, then
# match here, never the other way around.
services:
jellyfin-dev:
image: jellyfin/jellyfin:10.10.3
container_name: jellyfin-dev
restart: unless-stopped
user: "1000:1000"
userns_mode: "host"
environment:
- TZ=Europe/London
- JELLYFIN_PublishedServerUrl=https://dev.arrflix.s8n.ru
volumes:
- /home/docker/jellyfin-dev/config:/config
- /home/docker/jellyfin-dev/cache:/cache
- /home/user/media:/media:ro
networks:
- proxy
labels:
- "traefik.enable=true"
- "traefik.docker.network=proxy"
- "traefik.http.routers.jellyfin-dev.rule=Host(`dev.arrflix.s8n.ru`)"
- "traefik.http.routers.jellyfin-dev.entrypoints=websecure"
- "traefik.http.routers.jellyfin-dev.tls=true"
- "traefik.http.routers.jellyfin-dev.tls.certresolver=letsencrypt"
- "traefik.http.routers.jellyfin-dev.middlewares=security-headers@file,no-guest@file"
- "traefik.http.services.jellyfin-dev.loadbalancer.server.port=8096"
networks:
proxy:
external: true

View file

@ -1,63 +0,0 @@
# 00 — ARRFLIX Technical Overview
ARRFLIX is the operator's premium home-streaming project: AI-upscaled masters,
hand-curated metadata, and a Netflix-faithful viewing surface for a small
trusted set of users.
Under the hood, the stack is **Jellyfin 10.10.3 in Docker on nullstone**, sat
behind **Traefik** with **Let's Encrypt DNS-01 via Gandi** for TLS, name-served
internally by **Pi-hole**, and access-bounded to the LAN (`192.168.0.0/24`)
plus tagged tailnet nodes via a Traefik allowlist middleware. The Jellyfin web
UI is rebranded as ARRFLIX through a `web-overrides/` bind-mount, an SPA
runtime shim, and the **NeutralFin / Cineplex** CSS theme stack — none of the
default Jellyfin chrome, names, or logos are reachable by an unprivileged user.
---
## Architecture
| Layer | Component |
|------------------|------------------------------------------------------------------------------------------------|
| Frontend | Jellyfin web bundle, themed and rebranded as ARRFLIX (web-overrides + NeutralFin/Cineplex CSS) |
| Application | `jellyfin/jellyfin:10.10.3` container on nullstone, sibling `jellyfin-dev` for theme work |
| Reverse proxy | Traefik with **file-provider** routing (docker-label routing flakes for this container) |
| DNS | Pi-hole internal A record: `arrflix.s8n.ru` → `192.168.0.100` |
| TLS | Let's Encrypt via DNS-01, Gandi LiveDNS provider |
| Storage (media) | RO bind-mount from host `/home/user/media/{movies,tv,…}` → container `/media` |
| Storage (state) | Config + cache + metadata under host `/home/docker/jellyfin/` |
| ACL | Traefik `no-guest@file` middleware: LAN `192.168.0.0/24` + tailnet admin/infra tags only |
| WAN exposure | A record published; router port-forward gated — see `09-wan-exposure.md` |
A second container `jellyfin-dev` runs on `dev.arrflix.s8n.ru` as a behavioural
mirror of prod for theme and branding experiments — same media (read-only),
separate config and users, LAN-only.
---
## Read these in order
1. [`01-artwork-and-images.md`](01-artwork-and-images.md) — how artwork flows through Jellyfin and the curl recipes used to repair a botched first scan.
2. [`02-metadata-and-titles.md`](02-metadata-and-titles.md) — episode/title scraping, `RemoteSearch/Apply`, and the lock-the-series workflow.
3. [`03-subtitles.md`](03-subtitles.md) — subtitle resolution order, sidecar conventions, and the OpenSubtitles plugin setup.
4. [`04-theming-and-users.md`](04-theming-and-users.md) — active theme (Cineplex v1.0.6), server-side branding, multi-user UX, SyncPlay, revert path.
5. [`05-file-structure-rules.md`](05-file-structure-rules.md) — authoritative on-disk layout for Movies / TV / Anime / Music libraries.
6. [`06-per-library-themes.md`](06-per-library-themes.md) — research note on shimming per-library CSS scoping (Movies = Netflix, Anime = Crunchyroll, Music = Spotify).
7. [`07-pre-import-cleanup.md`](07-pre-import-cleanup.md) — normative ruleset for stripping junk from scene/group dumps before import.
8. [`08-filename-normalization.md`](08-filename-normalization.md) — canonical, group-tag-free renaming ruleset between "torrent dump" and the live tree.
9. [`09-wan-exposure.md`](09-wan-exposure.md) — the LAN-only → public-internet plan, server-side changes already applied, router TODOs, and rollback.
10. [`10-spa-runtime-shim.md`](10-spa-runtime-shim.md) — why static `<title>` patching loses to Jellyfin's SPA, and the runtime shim that wins it back.
11. [`11-neutralfin-audit.md`](11-neutralfin-audit.md) — read-only audit of the NeutralFin render gap vs the demo screenshots (no fixes applied).
12. [`12-dev-instance.md`](12-dev-instance.md) — `jellyfin-dev` sibling container: image pinning, mounts, and isolation guarantees.
13. [`13-optimization-audit.md`](13-optimization-audit.md) — read-only performance / capacity / reliability / ops-hygiene audit across REST, host, and container.
14. [`14-theme-audit.md`](14-theme-audit.md) — Cineplex theme audit and the detail-page left-band backdrop diagnosis (forward plan, not a fix).
15. [`15-force-english.md`](15-force-english.md) — root cause of the German Play button and the per-user `UICulture` pin that fixes it.
16. [`16-jellyfin-branding-leaks.md`](16-jellyfin-branding-leaks.md) — exhaustive inventory of every place "Jellyfin" or the teal triangle still leaks to a non-admin.
17. [`17-dev-mirror-and-settings-fix.md`](17-dev-mirror-and-settings-fix.md) — making dev a faithful prod mirror and fixing the non-admin Settings drawer leak (dev only).
---
## See also
- [`../README.md`](../README.md) — ARRFLIX brand-facing project page.
- [`../ADMIN-GUIDE.md`](../ADMIN-GUIDE.md) — operator runbook (day-to-day administration).
- [`../ROADMAP.md`](../ROADMAP.md) — what's next and what's known-broken.

View file

@ -10,7 +10,7 @@ nullstone). Auth header below uses the long-lived API token — replace with you
own `X-Emby-Token` if needed.
```bash
TOKEN="<JELLYFIN_API_TOKEN>"
TOKEN="*redacted*"
H="-H \"Authorization: MediaBrowser Token=${TOKEN}\""
BASE="https://arrflix.s8n.ru"
```

View file

@ -203,7 +203,7 @@ For our Futurama set, all files are single-episode (`s01e01.pl.mkv`), so this di
Step-by-step, with the exact commands run:
```bash
TOKEN=<JELLYFIN_API_TOKEN>
TOKEN=*redacted*
SERIES_ID=156e57437f795e5c8cd80fc98bafaee0 # Futurama
LIB_ID=767bffe4f11c93ef34b805451a696a4e # TV Shows library

View file

@ -90,7 +90,7 @@ User does NOT need to obtain an API key. The plugin embeds its own key (verified
After signup at opensubtitles.com, save creds via API:
```bash
TOKEN=<JELLYFIN_API_TOKEN>
TOKEN=*redacted*
USER='your-opensubtitles-com-username'
PASS='your-opensubtitles-com-password'

View file

@ -6,16 +6,20 @@ ElegantFin v25.12.31 the same day after a Netflix-fidelity-driven survey.
Scope: visual theme, server-side branding, multi-user UX prep, SyncPlay,
maintenance/revert. LAN-only constraints preserved (no public-facing changes).
> Hostname note: this site is being renamed `tv.s8n.ru` → `arrflix.s8n.ru`
> in the same session. The Jellyfin API endpoints don't care about
> hostname — they're served by the same container. All `curl` examples
> below are reachable as either `https://tv.s8n.ru/...` (legacy) or
> `https://arrflix.s8n.ru/...` (new), as long as Traefik has a SNI cert
> for the name. Internal pin: both names should resolve to `192.168.0.100`
> (see CLAUDE.md memory `feedback_s8n_hosts_override.md`). If a
> hostname's DNS or cert isn't up yet, use
> `--resolve tv.s8n.ru:443:192.168.0.100` on curl — that's how this
> re-theming was applied while `arrflix.s8n.ru` was still missing a cert.
---
## 1. Theme decision: ElegantFin v25.12.31 + ARRFLIX recolor (current)
**As of 2026-05-08 (later in the day), the active theme is ElegantFin
v25.12.31 with the Netflix-red `#E50914` accent recolored over the
default Jellyseerr-blue/violet palette and the ARRFLIX wordmark logo
preserved.** See §3e for the migration details and §1.x ("Previous
themes") below for the Cineplex history that preceded it.
## 1. Theme decision: Cineplex v1.0.6 (Netflix-faithful)
### Candidates surveyed (2026-05-08)
@ -30,18 +34,7 @@ themes") below for the Cineplex history that preceded it.
| zombB / NetfliFin / Finetwo | mostly fork-style replacement of jellyfin-web | varies | varies | n/a | requires image swap or JS injector | DQ — violates "pure CSS, no image swap, no plugins" constraint |
| Ultrachromic (CTalvio) | community CSS | "selectively maintained" | varies | 6/10 — accent-tunable but no Netflix preset | unknown | not Netflix enough |
### Previous themes
The two sub-sections below ("Why Cineplex won", "Tradeoffs", "What it
looks like", "Theme history") are kept verbatim from when Cineplex was
the active theme (earlier on 2026-05-08, before the ElegantFin migration
documented in §3e). They remain useful as the reasoning trail for the
final brand brief — Netflix-faithful was the goal, Cineplex was the
purest expression of that, and the current ElegantFin + recolor stack
is a deliberate tradeoff toward "more polished browsing UI" while
keeping the Netflix-red accent.
#### Why Cineplex won (historical)
### Why Cineplex won
1. **It is actually Netflix.** The CSS literally embeds Netflix Sans
(`https://assets.nflxext.com/ffe/siteui/fonts/netflix-sans/v3/...`) and
@ -69,7 +62,7 @@ keeping the Netflix-red accent.
6. **Cast/crew hide rule still appended** at the bottom of `CustomCss`,
exactly as before.
#### Tradeoffs (honest list, Cineplex era)
### Tradeoffs (honest list)
- **License: none.** Cineplex doesn't declare one. CSS is generally
permissive in practice (you redistribute by `@import`, not by copying)
@ -86,7 +79,7 @@ keeping the Netflix-red accent.
- **Theme footer.** Cineplex doesn't add a brand stamp, so users see no
"Cineplex" tag — cleaner than ElegantFin's footer label was.
#### What Cineplex looked like (live, post-apply)
### What it looks like (live, post-apply)
- **Background:** `#181818` (Finity base) — Netflix-black.
- **Accent:** `#E50914` (canonical Netflix red) on focus rings, progress
@ -99,18 +92,16 @@ keeping the Netflix-red accent.
netflix.com's sign-in page.
- **No theme-brand footer label** any more.
#### Theme history
### Theme history
| Date | Theme | Version | Why changed |
|---|---|---|---|
| 2026-05-08 (earlier today) | ElegantFin | v25.12.31 | Initial Jellyfin theming pass. Picked for activity + safety (most actively maintained CSS in the ecosystem). |
| 2026-05-08 (mid-day) | **Cineplex** | **v1.0.6** | Owner asked for the most Netflix-faithful theme available. ElegantFin's Jellyseerr aesthetic (blue-grey, no red) is too far from Netflix; Cineplex is purpose-built for this look and explicitly targets the 10.10 series we're on. JellyFlix (the genre's elder) is halted. |
| 2026-05-08 (later, current) | **ElegantFin + ARRFLIX recolor** | **v25.12.31** + `#E50914` accent overrides | Owner liked Cineplex's Netflix accent but preferred ElegantFin's polished browsing UI. Best of both: ElegantFin's layout/typography + ARRFLIX brand red overrides. Snapshot tag for rollback: `snapshot-2026-05-08-pre-elegantfin`. See §3e. |
| 2026-05-08 (this entry) | **Cineplex** | **v1.0.6** | Owner asked for the most Netflix-faithful theme available. ElegantFin's Jellyseerr aesthetic (blue-grey, no red) is too far from Netflix; Cineplex is purpose-built for this look and explicitly targets the 10.10 series we're on. JellyFlix (the genre's elder) is halted. |
Rollback paths:
- To Cineplex (Netflix-faithful): apply `snapshots/2026-05-08-pre-elegantfin/branding.json` per `snapshots/2026-05-08-pre-elegantfin/RESTORE.md`.
- To plain ElegantFin (no recolor): see §6b.
- To vanilla Jellyfin: see §6b.
If we ever roll back to ElegantFin, the previous `@import` was
`https://cdn.jsdelivr.net/gh/lscambo13/ElegantFin@v25.12.31/Theme/ElegantFin-jellyfin-theme-build-latest-minified.css`.
The previous incarnation of this doc lives in git history.
---
@@ -119,25 +110,25 @@ Rollback paths:
### Branding API (Cineplex, applied 2026-05-08)
```bash
TOKEN=<JELLYFIN_API_TOKEN>
TOKEN=*redacted*
cat > /tmp/branding.json <<'EOF'
{
"LoginDisclaimer": "Welcome to arrflix.s8n.ru — LAN-only. Be kind, rewind.",
"LoginDisclaimer": "Welcome to tv.s8n.ru — LAN-only. Be kind, rewind.",
"CustomCss": "/* Cineplex v1.0.6 — Netflix-faithful theme by MRunkehl, pinned tag (immutable on jsDelivr) */\n/* Compat: Jellyfin 10.10.7+ ; we run 10.10.3 — verified rendering 2026-05-08 */\n@import url(\"https://cdn.jsdelivr.net/gh/MRunkehl/cineplex@v1.0.6/cineplex.css\");\n\n/* Hide Cast & Crew + Guest Stars sections globally (preserved 2026-05-08) */\n#castCollapsible, #guestCastCollapsible { display: none !important; }\n",
"SplashscreenEnabled": true
}
EOF
# Note: arrflix.s8n.ru didn't have a Traefik SNI cert at apply-time, so
# we sent the request to the legacy SNI arrflix.s8n.ru and pinned its address
# we sent the request to the legacy SNI tv.s8n.ru and pinned its address
# with --resolve. Either form is fine once both names have certs.
curl -sS --resolve arrflix.s8n.ru:443:192.168.0.100 \
curl -sS --resolve tv.s8n.ru:443:192.168.0.100 \
-X POST \
-H "X-Emby-Token: $TOKEN" \
-H "Content-Type: application/json" \
--data-binary @/tmp/branding.json \
https://arrflix.s8n.ru/System/Configuration/branding
https://tv.s8n.ru/System/Configuration/branding
# expect: HTTP 204 (got HTTP 204 — applied)
```
@@ -145,15 +136,15 @@ curl -sS --resolve arrflix.s8n.ru:443:192.168.0.100 \
```bash
# 1. Admin endpoint — confirms the new CustomCss is stored.
curl -sS --resolve arrflix.s8n.ru:443:192.168.0.100 \
curl -sS --resolve tv.s8n.ru:443:192.168.0.100 \
-H "X-Emby-Token: $TOKEN" \
https://arrflix.s8n.ru/System/Configuration/branding | python3 -m json.tool
https://tv.s8n.ru/System/Configuration/branding | python3 -m json.tool
# Result: HTTP 200, contains the Cineplex @import + cast/crew hide rule.
# 2. Anonymous endpoint the SPA reads at runtime — confirms what every
# browser will pull before login.
curl -sS --resolve arrflix.s8n.ru:443:192.168.0.100 \
https://arrflix.s8n.ru/Branding/Configuration | python3 -m json.tool
curl -sS --resolve tv.s8n.ru:443:192.168.0.100 \
https://tv.s8n.ru/Branding/Configuration | python3 -m json.tool
# Result: HTTP 200, identical CustomCss to admin endpoint. ✓
# 3. The CSS asset itself on jsDelivr (sanity-check the network path).
@@ -162,7 +153,7 @@ curl -sSI "https://cdn.jsdelivr.net/gh/MRunkehl/cineplex@v1.0.6/cineplex.css" |
# cache-control: public, max-age=31536000, immutable. ✓
# 4. SPA shell still routes (nav not broken).
curl -sSI --resolve arrflix.s8n.ru:443:192.168.0.100 https://arrflix.s8n.ru/ | head -1
curl -sSI --resolve tv.s8n.ru:443:192.168.0.100 https://tv.s8n.ru/ | head -1
# Result: HTTP/2 302 → /web/. ✓
```
@@ -170,7 +161,7 @@ curl -sSI --resolve arrflix.s8n.ru:443:192.168.0.100 https://arrflix.s8n.ru/ | h
by the JS bundle from `/Branding/Configuration`, not inlined into
`index.html`. So `curl /` won't grep-match. The valid JSON at
`/Branding/Configuration` is the API-level confirmation. Final visual
check is a hard browser reload (Ctrl-Shift-R) on `https://arrflix.s8n.ru`
check is a hard browser reload (Ctrl-Shift-R) on `https://tv.s8n.ru`
(or `https://arrflix.s8n.ru` once its cert is up) from the LAN — owner
will do this.
@@ -383,135 +374,6 @@ hide, ARRFLIX logo override, Quick Connect hide, Settings drawer hide,
header icon hide) preserved verbatim. Same race rule applies — this is
the last branding POST in the sequence.
### 3e. ElegantFin migration with ARRFLIX recolor (2026-05-08, current)
Later on 2026-05-08, the active theme was migrated **from Cineplex to
ElegantFin v25.12.31** while preserving the ARRFLIX brand: Netflix-red
`#E50914` accent overrides over ElegantFin's default Jellyseerr-blue/
violet palette, plus the existing ARRFLIX wordmark logo. The owner had
seen the demo at <https://lscambo13.github.io/ElegantFin/>, liked
ElegantFin's polished browsing UI more than Cineplex's purer Netflix
fidelity, and asked for the swap with the brand colour kept intact.
**Snapshot tag for rollback (committed and pushed before any change):**
`snapshot-2026-05-08-pre-elegantfin`. Captures `branding.json`,
`index.html`, `docker-compose.yml`, all per-user `displayprefs-*.json`,
`users.json`, `libraries.json`, plus `RESTORE.md` with three concrete
rollback commands. Located at `snapshots/2026-05-08-pre-elegantfin/`.
**ElegantFin tag pinned: `v25.12.31`** (latest tag at migration time;
list resolved via `git ls-remote --tags https://github.com/lscambo13/ElegantFin.git`).
jsDelivr serves tagged refs immutably with year-long cache TTL — same
no-surprise-update guarantee we had on `cineplex@v1.0.6`. To opt into
upstream churn, edit the URL to `@main`; to pin a different tag, edit
the version segment.
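A sketch of that tag resolution (assumes GNU `sort -V` is available;
eyeball the full list before pinning, since version-sort is a heuristic):

```bash
# List upstream tags, drop peeled ^{} refs, show the highest version-sorted one.
git ls-remote --tags https://github.com/lscambo13/ElegantFin.git \
  | awk -F/ '{print $NF}' \
  | grep -v '\^{}' \
  | sort -V | tail -1
# expected at migration time: v25.12.31
```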
**ElegantFin import:**
```css
@import url("https://cdn.jsdelivr.net/gh/lscambo13/ElegantFin@v25.12.31/Theme/ElegantFin-jellyfin-theme-build-latest-minified.css");
```
**Accent variables overridden (ARRFLIX recolor block).** ElegantFin
declares its accent palette through CSS custom properties at `:root`.
Eight variables were identified by grepping the minified theme for
`--[a-z]*` definitions and inspecting their default values; all eight
are remapped to `#E50914` (or its `rgba()` form for alpha variants):
| Variable | ElegantFin default | ARRFLIX value | What it controls |
|---|---|---|---|
| `--uiAccentColor` | `rgb(117 111 226)` (violet) | `#E50914` | Primary UI accent — most surfaces |
| `--activeColor` | `rgb(119,91,244)` (violet) | `#E50914` | Active / focused state highlights |
| `--activeColorAlpha` | `rgba(119,91,244,.9)` | `rgba(229, 9, 20, 0.9)` | Same with alpha — hover overlays |
| `--osdSeekBarPlayedColor` | `var(--textColor)` (white) | `#E50914` | Played portion of the video scrubber |
| `--checkboxCheckedBgColor` | `rgb(79,70,229)` (indigo) | `#E50914` | Checked checkboxes (settings, lib pickers) |
| `--highlightOutlineColor` | `rgb(37,99,235)` (blue) | `#E50914` | Focus / highlight outlines on cards |
| `--btnSubmitColor` | `rgb(61,54,178)` (indigo) | `#E50914` | "Submit" button background |
| `--btnSubmitBorderColor` | `rgb(117 111 226)` (violet) | `#E50914` | "Submit" button border |
Override block:
```css
:root {
--uiAccentColor: #E50914 !important;
--activeColor: #E50914 !important;
--activeColorAlpha: rgba(229, 9, 20, 0.9) !important;
--osdSeekBarPlayedColor: #E50914 !important;
--checkboxCheckedBgColor: #E50914 !important;
--highlightOutlineColor: #E50914 !important;
--btnSubmitColor: #E50914 !important;
--btnSubmitBorderColor: #E50914 !important;
}
```
Variables deliberately NOT changed:
- `--osdSeekBarThumbColor: white` — kept the explicit white-thumb rule
from §3d (white thumbs read as a neutral position indicator, not as
brand colour). The slider-thumb override in this doc's §3d still
applies.
- `--drawerColor`, `--headerColor` — kept ElegantFin's translucent
blur over its dark-blue surface; these are structural, not accent.
- `--borderColor`, `--textColor` — typography / structure, not accent.
**Logo selectors used.** ElegantFin does NOT define rules for the two
ARRFLIX logo selectors (verified by grepping the minified theme for
`adminDrawerLogo` and `pageTitleWithLogo` — zero matches), so the same
override skeleton from §3a/§3b is re-applied verbatim against the
ElegantFin base:
```css
.adminDrawerLogo img {
/* <img> in admin sidebar drawer — content: replaces src */
content: url("data:image/png;base64,<...ARRFLIX wordmark...>") !important;
}
.pageTitleWithLogo {
/* <div> masthead on dashboard + login — bg image only, no content: */
background-image: url("data:image/png;base64,<...ARRFLIX wordmark...>") !important;
}
```
The data-URL bytes are byte-for-byte identical to the Cineplex-era
override (extracted from the snapshot's `branding.json` and re-inlined
into the new `CustomCss` payload). Both selectors are still split-rule
form (per the §3a/§3b lesson — never combine `content:` and
`background-image:` on the same selector).
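A sketch of that extraction step, assuming the data URL sits in a
`url("…")` wrapper inside the snapshot's `CustomCss` (the regex shape is
an assumption about the CSS, not a documented format):

```bash
python3 - <<'PY'
# Pull the ARRFLIX wordmark data URL out of the snapshot's branding.json
# so it can be re-inlined byte-for-byte into the new CustomCss payload.
import json, re
css = json.load(open("snapshots/2026-05-08-pre-elegantfin/branding.json"))["CustomCss"]
m = re.search(r'url\("(data:image/png;base64,[^"]+)"\)', css)
print(m.group(1) if m else "no data URL found")
PY
```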
**Preserved blocks** (every custom rule from the Cineplex era was
re-applied on top of ElegantFin; an assembly sketch follows the list):
- `#castCollapsible, #guestCastCollapsible { display: none }` — cast/crew sections hidden
- `.btnQuick { display: none }` — Quick Connect login button hidden
- `.headerSyncButton`, `.headerCastButton`, `.headerUserButton` — top-right header icons hidden (§3c)
- `.MuiSlider-thumb` + variants — white scrubber/volume thumbs (§3d)
- `:root { --primary-background-color: #000000; --background-color: #000000; }` and the wrapper-element rules — pure black bg (§3d)
- `mypreferencesmenu` selectors — Settings drawer entry hidden
- `.countIndicator { display: none }` — unwatched-episode count badges hidden
- `.adminDrawerLogo img` / `.pageTitleWithLogo` — ARRFLIX wordmark override
- `LoginDisclaimer` → `"Welcome to ARRFLIX - Private invite only service"` — preserved
- `SplashscreenEnabled: true` — preserved
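For reference, a sketch of how a layered payload like this can be
assembled and POSTed without JSON-escaping mistakes. The three part
files are illustrative names, not files in this repo; `python3` does
the escaping, matching the shape of the doc's other branding POSTs:

```bash
# Concatenate theme import + recolor + preserved blocks, JSON-encode, POST.
cat import.css arrflix-recolor.css preserved-blocks.css > /tmp/customcss.css
python3 - <<'PY'
import json, pathlib
payload = {
    "CustomCss": pathlib.Path("/tmp/customcss.css").read_text(),
    "LoginDisclaimer": "Welcome to ARRFLIX - Private invite only service",
    "SplashscreenEnabled": True,
}
pathlib.Path("/tmp/branding.json").write_text(json.dumps(payload))
PY
curl -sS --resolve arrflix.s8n.ru:443:192.168.0.100 \
  -X POST -H "X-Emby-Token: $TOKEN" -H "Content-Type: application/json" \
  --data-binary @/tmp/branding.json \
  https://arrflix.s8n.ru/System/Configuration/branding
# expect: HTTP 204
```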
**Verification (executed 2026-05-08):**
- POST to `/System/Configuration/branding` → HTTP 204
- GET on `/Branding/Configuration` → no Cineplex `@import`, ElegantFin
`@import` present and pinned to `v25.12.31`, ARRFLIX logo data URL
intact on both selectors, all preserved blocks intact, all eight
accent variable overrides present
- HEAD on `https://arrflix.s8n.ru/` → HTTP 302 (Traefik redirect to
`web/`, baseline behaviour — proxy still serving)
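A scriptable form of that check (a sketch; it only greps the stored CSS
string and renders nothing):

```bash
curl -sS --resolve arrflix.s8n.ru:443:192.168.0.100 \
  https://arrflix.s8n.ru/Branding/Configuration \
  | python3 -c 'import json,sys; css=json.load(sys.stdin)["CustomCss"]; print("elegantfin" in css.lower(), "cineplex" in css.lower())'
# expect: True False
```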
**Operational notes:**
- The bind-mounted `/web/index.html` was NOT touched (sibling work owns
that file via the index-patcher). All visual changes ride on
`CustomCss` via the public `/Branding/Configuration` consumer + the
authenticated `/System/Configuration/branding` writer.
- No container restart, no `docker compose` action, no Traefik change.
- Same race rule from §3b applies — the branding POST in this migration
was the **last** POST in the sequence.
**Rollback** — see `snapshots/2026-05-08-pre-elegantfin/RESTORE.md`,
or in one shot: `git checkout snapshot-2026-05-08-pre-elegantfin --
snapshots/2026-05-08-pre-elegantfin/branding.json` then POST it back
to `/System/Configuration/branding`.
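Spelled out as a sketch (the authoritative steps live in `RESTORE.md`):

```bash
git checkout snapshot-2026-05-08-pre-elegantfin -- \
  snapshots/2026-05-08-pre-elegantfin/branding.json
curl -sS --resolve arrflix.s8n.ru:443:192.168.0.100 \
  -X POST -H "X-Emby-Token: $TOKEN" -H "Content-Type: application/json" \
  --data-binary @snapshots/2026-05-08-pre-elegantfin/branding.json \
  https://arrflix.s8n.ru/System/Configuration/branding
# expect: HTTP 204 (Cineplex-era branding restored)
```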
---
## 4. Multi-user UX prep
@@ -548,7 +410,7 @@ to `/System/Configuration/branding`.
> friend account that will exist later.
```bash
TOKEN=<JELLYFIN_API_TOKEN>
TOKEN=*redacted*
TVSHOWS_ID=767bffe4f11c93ef34b805451a696a4e
# 1. Create the user (auth header REQUIRED — admin token).
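# Sketch (assumption: Jellyfin 10.x creates users via POST /Users/New with a
# {"Name","Password"} body; verify against the server's API docs first):
curl -sS --resolve arrflix.s8n.ru:443:192.168.0.100 \
  -X POST -H "X-Emby-Token: $TOKEN" -H "Content-Type: application/json" \
  -d '{"Name": "friend", "Password": "changeme"}' \
  https://arrflix.s8n.ru/Users/New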
@@ -760,11 +622,11 @@ no surprise breakage. To opt into upstream changes:
```bash
# Move from immutable tag to floating @main (pulls future commits;
# jsDelivr cache TTL is up to 7d for floating refs).
curl -sS --resolve arrflix.s8n.ru:443:192.168.0.100 \
curl -sS --resolve tv.s8n.ru:443:192.168.0.100 \
-X POST -H "X-Emby-Token: $TOKEN" \
-H "Content-Type: application/json" \
-d '{"CustomCss": "@import url(\"https://cdn.jsdelivr.net/gh/MRunkehl/cineplex@main/cineplex.css\");\n#castCollapsible, #guestCastCollapsible { display: none !important; }", "LoginDisclaimer": "Welcome to arrflix.s8n.ru — LAN-only. Be kind, rewind.", "SplashscreenEnabled": true}' \
https://arrflix.s8n.ru/System/Configuration/branding
-d '{"CustomCss": "@import url(\"https://cdn.jsdelivr.net/gh/MRunkehl/cineplex@main/cineplex.css\");\n#castCollapsible, #guestCastCollapsible { display: none !important; }", "LoginDisclaimer": "Welcome to tv.s8n.ru — LAN-only. Be kind, rewind.", "SplashscreenEnabled": true}' \
https://tv.s8n.ru/System/Configuration/branding
```
Or just ask each user to hard-reload — their browser cache is the common
@@ -783,16 +645,16 @@ Replace the `@import` line:
```bash
# Back to ElegantFin (Jellyseerr-style):
curl -sS --resolve arrflix.s8n.ru:443:192.168.0.100 \
curl -sS --resolve tv.s8n.ru:443:192.168.0.100 \
-X POST -H "X-Emby-Token: $TOKEN" -H "Content-Type: application/json" \
-d '{"CustomCss": "@import url(\"https://cdn.jsdelivr.net/gh/lscambo13/ElegantFin@v25.12.31/Theme/ElegantFin-jellyfin-theme-build-latest-minified.css\");\n#castCollapsible, #guestCastCollapsible { display: none !important; }", "LoginDisclaimer": "Welcome to arrflix.s8n.ru — LAN-only. Be kind, rewind.", "SplashscreenEnabled": true}' \
https://arrflix.s8n.ru/System/Configuration/branding
-d '{"CustomCss": "@import url(\"https://cdn.jsdelivr.net/gh/lscambo13/ElegantFin@v25.12.31/Theme/ElegantFin-jellyfin-theme-build-latest-minified.css\");\n#castCollapsible, #guestCastCollapsible { display: none !important; }", "LoginDisclaimer": "Welcome to tv.s8n.ru — LAN-only. Be kind, rewind.", "SplashscreenEnabled": true}' \
https://tv.s8n.ru/System/Configuration/branding
# To vanilla Jellyfin (clear everything):
curl -sS --resolve arrflix.s8n.ru:443:192.168.0.100 \
curl -sS --resolve tv.s8n.ru:443:192.168.0.100 \
-X POST -H "X-Emby-Token: $TOKEN" -H "Content-Type: application/json" \
-d '{"CustomCss": "", "LoginDisclaimer": "", "SplashscreenEnabled": false}' \
https://arrflix.s8n.ru/System/Configuration/branding
https://tv.s8n.ru/System/Configuration/branding
```
Or in the UI: Dashboard → General → edit / clear "Custom CSS code" →
@@ -842,9 +704,9 @@ When the friend gets their account, walk them through this **once**:
General → Quick Connect).
- [ ] Configure SMTP for self-serve password reset (currently admin-only).
- [ ] Get Traefik to issue a SNI cert for `arrflix.s8n.ru` so the curl
examples don't need `--resolve arrflix.s8n.ru:443:192.168.0.100`. Until
examples don't need `--resolve tv.s8n.ru:443:192.168.0.100`. Until
then, both names point to the same backend on `192.168.0.100` but
only `arrflix.s8n.ru` has a valid cert.
only `tv.s8n.ru` has a valid cert.
- [ ] Watch [Cineplex commits](https://github.com/MRunkehl/cineplex/commits/main)
monthly; if a `v1.0.7` lands and looks safe, bump the pin.
- [ ] Add a 2nd library (movies are mounted but the server may have an
@@ -968,7 +968,7 @@ used at library creation.)
### 12.1 Creating libraries via API
```bash
TOKEN=<JELLYFIN_API_TOKEN>
TOKEN=*redacted*
H="-H \"X-Emby-Token: ${TOKEN}\""
B="https://arrflix.s8n.ru"
@@ -55,7 +55,7 @@ information needed to scope styles is simply not in the DOM.
### Why approach #3 fails
`GET /Library/VirtualFolders` (auth `X-Emby-Token: <JELLYFIN_API_TOKEN>`) returns
`GET /Library/VirtualFolders` (auth `X-Emby-Token: *redacted*`) returns
`LibraryOptions` containing only metadata/scan/subtitle settings. No `CustomCss`, no `Theme`, no
`Branding` per library. The single global CustomCss field at `/System/Configuration/branding` is the
only knob the server exposes.
@@ -1,261 +0,0 @@
# 11 — NeutralFin Render Audit
Status: **read-only audit**, executed 2026-05-08 against
`https://arrflix.s8n.ru` (Jellyfin 10.10.3 on nullstone). Owner reported
the live render "doesn't look as good as it should" relative to the
NeutralFin demo screenshots. Scope: identify why the current `CustomCss`
+ inline critical-path `<style>` block fail to deliver the polished
NeutralFin aesthetic. **No fixes applied. No state mutated.**
> **Headline finding (must read first).** The audit was commissioned
> against the **live NeutralFin render**, but at audit time
> `/Branding/Configuration` returned a `CustomCss` whose `@import` is
> `MRunkehl/cineplex@v1.0.6` and whose accompanying personal-tweak
> blocks reference Cineplex / Netflix-red, **not** NeutralFin. A single
> earlier curl in the audit session momentarily showed a NeutralFin
> import block, then three follow-up cache-busted curls reverted to
> Cineplex. This matches the §3b race rule in `04-theming-and-users.md`:
> the branding endpoint takes a complete object on every POST; whichever
> POST lands last wins. **A NeutralFin payload was applied and then
> overwritten by a sibling Cineplex POST.** The render the owner is
> seeing is therefore not NeutralFin — it is Cineplex with a stale set
> of personal tweaks layered on top, plus a critical-path `<style>` in
> `index.html` that pre-paints the page in Netflix red. That mismatch
> alone is the single highest-impact root cause; everything else below
> is secondary.
---
## 1. Visual contract — what NeutralFin should look like
Sourced from <https://github.com/KartoffelChipss/NeutralFin> README and
the upstream minified CSS at
`https://cdn.jsdelivr.net/gh/KartoffelChipss/NeutralFin@1.3.0/theme/neutralfin-minified.css`.
| Aspect | NeutralFin contract |
|---|---|
| Tagline | *"a sleek black and grey color scheme for a more neutral and modern look"* |
| Lineage | Built on ElegantFin (GPL-2.0). Bundles Jellyfin Lucide icons for the modern icon set. |
| Page background | **Gradient** between `--darkerGradientPoint #131313` and `--lighterGradientPoint #1e1e1e` (0deg). NOT pure `#000`. |
| Card background | `--cardBackgroundGradient` (same two-stop dark gradient). NOT pure `#000`. |
| Header / drawer surface | `--headerColor rgba(40,40,40,0.5)` and `--drawerColor rgba(40,40,40,0.9)` — translucent over a blurred backdrop. |
| Accent (UI) | `--uiAccentColor rgb(130,130,130)` — **mid-grey, not coloured**. |
| Active / focus tint | `--activeColor rgb(100,100,100)` — slightly darker grey. |
| Borders | `--borderColor rgb(71,71,71)` (mid), `--darkerBorderColor rgb(51,51,51)`, `--lighterBorderColor rgba(255,255,255,0.2)` — subtle hierarchy across cards/sections. |
| Selector bg | `--selectorBackgroundColor rgb(60,60,60)`. |
| Text | `--textColor rgb(209,213,219)` (off-white, not pure white). `--dimTextColor rgb(156,163,175)`. |
| Play button | `--btnMiniPlayColor rgb(41,154,93)` — the only saturated colour, used on play CTAs. |
| Delete button | `--btnDeleteColor rgb(169,29,29)` — saturated red, but ONLY for destructive confirms. |
| Recommended pairing | Owner enables **backdrops** in Jellyfin (`Display → Show backdrops`). The translucent header/drawer relies on having something to blur. |
| Minimum JF | Not stated. Demos shown on JF 10.11+. We're on 10.10.3 — selectors should largely match (Lucide icon refs may degrade gracefully). |
**Net visual impression:** subdued monochrome, soft gradients, mid-grey
accents, restrained borders, off-white text. The whole thing is a
*texture* of dark greys, not a flat black.
NeutralFin defines **none** of `--primary-background-color`,
`--background-color`, `--background-color-alpha`,
`--card-background-color`, or `--mui-palette-primary-main`. Those are
Jellyfin's own variables; NeutralFin lets Jellyfin's defaults pass
through and skins via its own `--darkerGradientPoint` /
`--lighterGradientPoint` / `--headerColor` / `--drawerColor` set.
---
## 2. Live state at audit time
**`/Branding/Configuration` (anon)** and
**`/System/Configuration/branding` (authed)** both return identical
payload, 25 225 chars of `CustomCss`. Theme banner comments name
**Cineplex v1.0.6**. Sole `@import` is
`https://cdn.jsdelivr.net/gh/MRunkehl/cineplex@v1.0.6/cineplex.css`.
`!important` count in CustomCss: **17**.
`#E50914` occurrences in CustomCss: **0**.
Inline critical-path `<style>` block in
`/jellyfin/jellyfin-web/index.html` (bind-mounted from
`web-overrides/index.html`, 93 lines total): forces `#000000` on
shell wrappers AND `#E50914` on `.raised, .button-submit,
.emby-button[type=submit], button[type=submit]`. **1 occurrence of
`#E50914`**, **8 occurrences of `ARRFLIX`** (title, shim, comments).
ARRFLIX wordmark PNG embedded in CustomCss: **235 × 85 px**,
aspect ratio **2.765**.
---
## 3. Drift table — every rule in current CustomCss + index.html
For each block, classify as KEEP (compatible with NeutralFin),
DROP (legacy / harmful), or MODIFY (needs adjustment for NeutralFin).
Assumes the owner's intent was to be on NeutralFin.
| # | Source | Block | Classification | Reason |
|---|---|---|---|---|
| 1 | CustomCss | `@import cineplex@v1.0.6` | **DROP** | Wrong theme entirely. Owner wants NeutralFin. Replace with `@import KartoffelChipss/NeutralFin@1.3.0/theme/neutralfin-minified.css`. |
| 2 | CustomCss | `#castCollapsible, #guestCastCollapsible { display:none }` | **KEEP** | Personal preference, theme-agnostic. NeutralFin doesn't redefine these. |
| 3 | CustomCss | `.adminDrawerLogo img { content: url(<ARRFLIX 235×85 PNG>) }` | **KEEP** | NeutralFin defines no rule for this selector (verified). Override stands. Split-rule form (per §3a) preserved. |
| 4 | CustomCss | `.pageTitleWithLogo { background-image: url(<same PNG>) }` | **KEEP** | Same; NeutralFin doesn't touch this selector. |
| 5 | CustomCss | `.btnQuick { display:none }` | **KEEP** | Server-side disable in §4g still in effect. CSS belt-and-braces is fine on any theme. |
| 6 | CustomCss | `.headerSyncButton/.headerCastButton/.headerUserButton { display:none }` | **KEEP** | NeutralFin sets `width/height/border` on `.headerUserButton` — those rules become moot under `display:none`, no conflict. |
| 7 | CustomCss | `.MuiSlider-thumb { color/bg/border:#fff }` (+ hover halo) | **KEEP**, but reconsider | NeutralFin doesn't theme MUI sliders. White thumbs work, but they're a Cineplex-era decision when the rest of the chrome was Netflix-red/white/black. Against a monochrome grey theme, mid-grey thumbs would read more native. Low priority. |
| 8 | CustomCss | `:root { --primary-background-color:#000 !important; --background-color:#000 !important }` | **DROP** | **High-impact harm.** NeutralFin's whole aesthetic depends on the page background showing the gradient between `#131313` and `#1e1e1e`. Forcing `#000` flattens that gradient to a single pure black, killing the depth NeutralFin was designed to deliver. Owner literally cannot see NeutralFin's intent while these vars are clamped. |
| 9 | CustomCss | `html, body, .preload, .skinBody, .mainDrawerHandle { background-color:#000 !important }` | **DROP** | Same — clobbers NeutralFin's gradient surface. NeutralFin paints page bg via the gradient applied at body / wrapper level; this rule forces solid black underneath. |
| 10 | CustomCss | `.skinHeader/.skinHeader.semiTransparent/.skinHeader-withBackground/.mainAnimatedPages/#reactRoot/.dashboardDocument { background:#000 !important }` | **DROP** | Same. Notably `.skinHeader.semiTransparent` is the surface NeutralFin's `--headerColor rgba(40,40,40,0.5)` translucency renders OVER. Forcing `#000` underneath defeats the blur/translucency effect — the header becomes a flat black bar instead of a glassy panel. |
| 11 | CustomCss | `mypreferencesmenu` :has() block | **KEEP** | Personal tweak, theme-agnostic. JF 10.10.3 supports `:has()` in modern browsers; if a user is on Firefox <121 they'll see the link, but no harm to NeutralFin. |
| 12 | CustomCss | `.countIndicator { display:none }` | **KEEP**, but note | NeutralFin sets `background:#1f50bd; border:var(--defaultLighterBorder)` on this selector. Hiding it is fine and is what owner asked for; the NeutralFin styling becomes irrelevant under `display:none`. |
| 13 | index.html `<style>` | `:root { --primary-background-color:#000; --background-color:#000 }` (no `!important`) | **MODIFY (DROP the var lines)** | Same harm as row 8 but in a sneakier place: it's pre-bundle, paints before CustomCss arrives, then CustomCss row 8 keeps it pinned post-bundle. For NeutralFin to look right, both need to go. |
| 14 | index.html `<style>` | `html, body, .preload, .skinBody, .skinHeader, #reactRoot, .mainAnimatedPages { background:#000 !important; color:#fff !important }` | **MODIFY** | Drop `.skinHeader` from the selector list (so NeutralFin's translucent header isn't pre-painted black) and consider dropping the wrapper bg overrides entirely. The `color:#fff` is also more saturated than NeutralFin's off-white `rgb(209,213,219)` — fine for pre-bundle anti-flash but needs to NOT be `!important` post-bundle. The `!important` here outranks NeutralFin's inherited text colour. |
| 15 | index.html `<style>` | `.raised, .button-submit, .emby-button[type=submit], button[type=submit] { background:#E50914 !important; color:#fff !important }` | **DROP** | **Critical.** NeutralFin is monochrome — the play CTA is green (`--btnMiniPlayColor`), submits use grey accent, only `--btnDeleteColor` is red and only on destructive confirms. This block paints **every submit button Netflix-red**, including login → Sign In, settings → Save, library → Add. Owner did not ask for that on NeutralFin. This is the most jarring single visual conflict. |
| 16 | index.html `<script>` | `nukeSettings()` MutationObserver + `setInterval(...,1000)` | **KEEP** | Targets `mypreferencesmenu` only; doesn't mutate styles or layout. Does fire on every DOM mutation (could be tens per second on rich pages) but the work is one querySelectorAll scoped to a narrow attribute selector. No measurable layout thrash on an idle page; on heavy lists it's the sort of thing to profile, but not a "looks bad" cause. |
| 17 | index.html `<script>` | `lockTitle/lockFavicon` head observer + interval | **KEEP** | Cosmetic, unrelated to render quality. |
---
## 4. Variable conflict report
| NeutralFin variable | Default | Overridden by us? | Effect |
|---|---|---|---|
| `--darkerGradientPoint` | `#131313` | no | Gradient bottom intact … but masked by row 9 `body{bg:#000}` |
| `--lighterGradientPoint` | `#1e1e1e` | no | Gradient top intact … masked same way |
| `--headerColor` | `rgba(40,40,40,0.5)` | no | Translucent header colour intact … but row 10 paints `.skinHeader{bg:#000}` underneath, so the alpha composes against pure black instead of the gradient. Header reads flatter than NeutralFin intends. |
| `--drawerColor` | `rgba(40,40,40,0.9)` | no | OK — drawer bg unaffected by the wrapper-element rules. |
| `--borderColor` | `rgb(71,71,71)` | no | OK |
| `--uiAccentColor` | `rgb(130,130,130)` | no in CustomCss; **YES** indirectly via index.html row 15 (every submit button forced red) | Submit buttons should be grey-accented; instead they are `#E50914`. |
| `--activeColor` | `rgb(100,100,100)` | no | OK |
| `--textColor` | `rgb(209,213,219)` | partially — index.html row 14 sets `color:#fff !important` on body | Text is full white instead of the off-white NeutralFin uses. Subtle but cumulative. |
| `--btnMiniPlayColor` | `rgb(41,154,93)` | no | Play CTA still green, OK. |
| `--btnDeleteColor` | `rgb(169,29,29)` | no | Delete confirms still red, OK. |
| Jellyfin `--primary-background-color` | (Jellyfin default `#101010`-ish) | **YES** — row 8 + row 13 → `#000` | NeutralFin doesn't override this var; NeutralFin paints the gradient directly on `body`. Forcing `--primary-background-color:#000` doesn't break NeutralFin's body gradient (NeutralFin doesn't read this var) BUT the `body{bg:#000 !important}` rule that lives next to it DOES, because it sets the body bg directly and beats NeutralFin's lower-specificity body rule. |
| Jellyfin `--background-color` | Jellyfin default | **YES** — row 8 + row 13 → `#000` | Same — variable override harmless on its own; the wrapper rule next door is the real damage. |
| `--mui-palette-primary-main` | (MUI default) | no | OK; sliders/checkboxes keep MUI palette. |
---
## 5. Logo aspect ratio
ARRFLIX wordmark PNG: **235 × 85 px**, aspect **2.765 : 1**.
NeutralFin (and Cineplex) target three logo containers:
- `.adminDrawerLogo img` — admin sidebar drawer. Inherits sidebar
width (~240 px on desktop). 235 × 85 fits naturally; replaces
`<img>` source via `content:`. **Match: YES.**
- `.pageTitleWithLogo` — masthead `<div>` on dashboard / login pages.
In NeutralFin this `<div>` is sized by `var(--appBarHeight) 5em`
(header height) and the `background-image` is laid out with
`background-size: contain` (NeutralFin / ElegantFin convention).
At 5 em ≈ 80 px header height a 235 × 85 image will render at
~221 × 80 — fits the header band cleanly. **Match: YES**, no
squish, no clip.
- `.detailLogo` — clear-logo on item detail pages (movies / shows).
  NeutralFin sizes this at `width:40%; height:25vh; background-position:bottom`
  with `background-size:contain` — designed for tall, near-square
  clear logos. At 1080p the box is 40% = 768 px wide and 25vh = 270 px
  tall; `contain` scales the 2.765:1 wordmark to ~747 × 270 px
  (height-limited, just inside the width cap). Only at very narrow
  viewports does the 40% width become the limiting dimension.
  Acceptable, no distortion. **Match: YES.**
**Verdict: Logo aspect ratio is fine. Not a render-quality root
cause.** A 235 × 85 wordmark is on the wide end of typical Jellyfin
custom logos but fits every container cleanly because both NeutralFin
and Cineplex use `background-size: contain` on the masthead.
---
## 6. Recommended fix list (impact-ranked, top = biggest visual win)
> **Read-only audit. None of these have been applied.** Owner sign-off
> required before any branding POST.
1. **Apply NeutralFin (currently NOT applied).** Replace the
`@import` line in CustomCss to point at
`https://cdn.jsdelivr.net/gh/KartoffelChipss/NeutralFin@1.3.0/theme/neutralfin-minified.css`.
(Verify the live `Branding/Configuration` reflects this *after* the
POST, and that no sibling agent is racing the endpoint — see §3b
operational rule. Make this POST the LAST POST in the sequence.)
2. **Drop the pure-black background overrides** in CustomCss
(drift-table rows 8, 9, 10). NeutralFin's whole texture is the
`#131313 → #1e1e1e` gradient; clamping it to `#000` flattens it
and is the single biggest cause of the "not as good as it should"
feel.
3. **Drop `#E50914` from the index.html critical-path `<style>`**
(drift-table row 15). On NeutralFin, every submit button suddenly
being Netflix-red is the single most jarring visual conflict.
Also drop `.skinHeader` from the wrapper bg list (row 14) and
the `--primary-background-color/--background-color #000`
declarations (row 13). What stays in the critical-path `<style>`
should be: `html, body { background:#0e0e0e }` (close enough to
NeutralFin's gradient midpoint to avoid pre-bundle flash without
clamping the gradient post-bundle) and `color:#d1d5db` (the
off-white NeutralFin uses) — both WITHOUT `!important` so the
theme can take over once it loads. See the sketch after this list.
4. **Reconsider the white slider thumbs** (row 7) once fixes 1–3 land. If
the owner still finds them too "Netflix" against a grey theme,
change to `currentColor` or `var(--uiAccentColor)`. Low priority,
purely taste.
5. **Audit the `!important` count** post-fix. Currently 17; once the
black-bg wrapper rules drop, the count falls to ~10, all of which
are legitimate (display:none overrides, logo content: replacements,
slider thumb forces). NeutralFin's hover/focus states will then
fire correctly because no `!important` rule is masking them.
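A sketch of the slimmed critical-path `<style>` contents recommended in
fix 3 (colour values taken from NeutralFin's palette above; a starting
point, not a final block):

```css
/* Anti-flash pre-paint only; no !important, so NeutralFin wins once loaded. */
html, body {
  background: #0e0e0e; /* near the #131313 → #1e1e1e gradient midpoint */
  color: #d1d5db;      /* NeutralFin's off-white --textColor */
}
```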
---
## 7. Rollback note
If owner says "revert everything I had before the audit-driven fixes":
```bash
git checkout snapshot-2026-05-08-pre-elegantfin -- \
snapshots/2026-05-08-pre-elegantfin/branding.json
# then POST that file's contents to /System/Configuration/branding
# (full restore command in snapshots/2026-05-08-pre-elegantfin/RESTORE.md)
```
That snapshot captures the **Cineplex era** state — the CustomCss in it
is the same Cineplex import that's live RIGHT NOW (modulo personal-tweak
appendices that were added after the snapshot). It does NOT contain a
NeutralFin import, because NeutralFin was never persisted long enough
to enter the canonical history; the §3e ElegantFin migration block in
`04-theming-and-users.md` documents an *intended* state that the owner
had asked for but which a sibling Cineplex POST has since silently
reverted.
For a clean reset to vanilla Jellyfin (no theme at all) before
re-trying NeutralFin:
```bash
curl -sS -X POST -H "X-Emby-Token: $TOKEN" \
-H "Content-Type: application/json" \
-d '{"CustomCss":"","LoginDisclaimer":"Welcome to ARRFLIX - Private invite only service","SplashscreenEnabled":true}' \
https://arrflix.s8n.ru/System/Configuration/branding
```
---
## 8. What was NOT touched during this audit
- No POST to `/System/Configuration/branding`.
- No edit to `web-overrides/index.html` or the bind-mounted
`/jellyfin/jellyfin-web/index.html`.
- No `docker compose` action, no container restart.
- No git commit on `snapshots/`, no tag movement.
- Read-only over SSH; only `docker exec jellyfin sh -c '...'` shell
invocations, all bounded to `wc -l` / `head` / `grep -c`.
---
## 9. Sign-off
- **Auditor:** s8n (audit pass, 2026-05-08)
- **Live theme at audit time:** Cineplex v1.0.6 (despite doc 04 §3e
claiming ElegantFin + ARRFLIX recolor; despite owner believing the
state is NeutralFin)
- **Doc 04 §3e accuracy:** stale — needs an §3f addendum after fixes
documenting the NeutralFin migration and the race-loss that hid it.
- **Next step:** owner reviews this doc, decides whether to apply the
fix list in §6. No work to be done on the live server until that
review.
@@ -1,174 +0,0 @@
# 12 — Jellyfin DEV instance for theme experimentation
A second Jellyfin container, `jellyfin-dev`, runs alongside prod on
nullstone. Same media library (read-only), separate config/cache/users,
separate domain. LAN-only by design — you can break it freely without
real users (marco, house, guest, 5) noticing.
---
## Architecture diff
| Aspect | Prod | Dev |
|-------------------|-------------------------------------|-------------------------------------------|
| Container | `jellyfin` | `jellyfin-dev` |
| Image | `jellyfin/jellyfin:10.10.3` | `jellyfin/jellyfin:10.10.3` (must match) |
| Compose path | `/opt/docker/jellyfin/` | `/opt/docker/jellyfin-dev/` |
| Config dir | `/home/docker/jellyfin/{config,cache}` | `/home/docker/jellyfin-dev/{config,cache}` |
| Media mount | `/home/user/media:/media:ro` | `/home/user/media:/media:ro` (SAME, RO) |
| Domain | `arrflix.s8n.ru` | `dev.arrflix.s8n.ru` |
| Pi-hole DNS | `dns.hosts` in pihole.toml | `dns.hosts` in pihole.toml (added 2026-05-08) |
| Traefik router | `Host(arrflix.s8n.ru)` | `Host(dev.arrflix.s8n.ru)` |
| Cert | LE DNS-01 (Gandi) | LE DNS-01 (auto-issued on first request) |
| Middleware | `security-headers@file` only | `security-headers@file,no-guest@file` |
| WAN exposure | Yes during WAN window (doc 09) | NEVER — LAN-only forever |
| Internal port | `8096` | `8096` |
| User | `1000:1000` | `1000:1000` |
| `userns_mode` | `host` | `host` |
| index.html shim | Bind-mounted (doc 10) | None (vanilla shell — clean theme canvas) |
| Branding/auth | Configured | Empty — first-run wizard required |
The compose file lives in this repo at `compose-dev/docker-compose.yml`
and is deployed to nullstone at `/opt/docker/jellyfin-dev/docker-compose.yml`.
---
## How to use
1. Open `https://dev.arrflix.s8n.ru` from any LAN/tailnet box. First visit hits the
first-run wizard — create an admin user (use any throwaway name; nothing
shared with prod).
2. Add libraries pointing at the same paths prod uses:
- `/media/movies`
- `/media/tv`
The library ROOTS are shared (read-only); dev will rescrape independently
into its own `library.db`. That's intentional — dev is a clean slate.
3. Apply a theme via Branding API or via the SPA shim (doc 10) by dropping
files into `/opt/docker/jellyfin-dev/web-overrides/` and adding the same
bind-mount pattern as prod (currently absent for a clean canvas).
4. Test, watch, break. Prod remains untouched on `arrflix.s8n.ru`.
---
## Theme workflow (dev → prod)
When a dev theme is "shipped":
1. **Export branding** from dev:
```bash
curl -k -H "X-Emby-Token: $DEV_TOKEN" \
https://dev.arrflix.s8n.ru/Branding/Configuration > /tmp/branding.json
```
2. **POST to prod**:
```bash
curl -k -X POST \
-H "X-Emby-Token: <JELLYFIN_API_TOKEN>" \
-H "Content-Type: application/json" \
--data @/tmp/branding.json \
https://arrflix.s8n.ru/System/Configuration/branding
```
3. If the theme involves SPA-shim files (custom JS/CSS), `rsync` them from
`dev:/opt/docker/jellyfin-dev/web-overrides/` to
`prod:/opt/docker/jellyfin/web-overrides/` and hot-reload prod via the
bind-mount (no container restart needed for read-only mounts on file
change — Jellyfin will serve the new file on next request).
Auth tokens for dev are local to the dev instance — they'll be issued by
the dev wizard. They DO NOT cross over.
---
## Reset / wipe dev
When experiments make a mess:
```bash
ssh user@192.168.0.100
cd /opt/docker/jellyfin-dev
docker compose down
sudo rm -rf /home/docker/jellyfin-dev/config/* /home/docker/jellyfin-dev/cache/*
# (use the privileged-userns-host bypass if no sudo:
# docker run --rm --privileged --userns=host -v /home/docker:/h alpine \
# sh -c 'rm -rf /h/jellyfin-dev/config/* /h/jellyfin-dev/cache/*')
docker compose up -d
```
First-run wizard reappears. The media library is intact (read-only mount,
unaffected).
---
## LAN-only enforcement
`no-guest@file` middleware (defined in `/opt/docker/traefik/config/dynamic.yml`)
restricts source IPs to:
- `127.0.0.0/8`
- `192.168.0.0/24` (LAN)
- `100.64.0.1/32` onyx, `100.64.0.2/32` nullstone, `100.64.0.4/32` office (tailnet)
- `82.22.5.233/32` YOU500 home IP
- `172.20.0.0/24` docker proxy gateway
Anyone outside that list trying `https://dev.arrflix.s8n.ru` from the WAN
gets a Traefik 403. Even if a guest tailnet node (100.64.0.3 friend GPU)
hits dev, no-guest blocks them — only `tag:admin` and `tag:infra` are
allowed.
There is **no plan** to expose dev publicly. If you need to test something
WAN-shaped, do it on prod inside the WAN window (doc 09) — never widen
dev's allowlist.
---
## Risks and non-risks
- **Read-only media mount.** Dev cannot write to `/home/user/media`.
Theme experiments cannot accidentally rename, delete or scramble files.
- **Separate library.db.** Dev rescrapes from scratch. If a metadata
experiment in dev produces bad results, it never touches prod metadata.
- **Same Traefik instance.** Both routers share the proxy network and the
one Traefik. A misconfigured label on dev could *theoretically* shadow
prod's router, but the rules are `Host(dev.arrflix.s8n.ru)` vs
`Host(arrflix.s8n.ru)` — disjoint. Sanity-check after any compose edit
with `curl -kI https://arrflix.s8n.ru/`.
- **Same image tag.** Bumping prod to a new Jellyfin version means
bumping dev too; do prod first, then sync dev. Never test a version
bump on dev and forget to mirror prod — the API surface might drift.
- **No shared sessions.** Tokens, users, watch progress, playlists are
100% isolated. A test admin in dev cannot act on prod, and vice versa.
---
## Quick reference
```
# Status
ssh user@192.168.0.100 'docker ps --filter name=jellyfin'
# Logs
ssh user@192.168.0.100 'docker logs jellyfin-dev --tail 100 -f'
# Restart
ssh user@192.168.0.100 'cd /opt/docker/jellyfin-dev && docker compose restart'
# Stop / start
ssh user@192.168.0.100 'cd /opt/docker/jellyfin-dev && docker compose down'
ssh user@192.168.0.100 'cd /opt/docker/jellyfin-dev && docker compose up -d'
# Health check from onyx
curl -kI https://dev.arrflix.s8n.ru
# expect HTTP/2 302, location: web/
```
---
## DNS pin path used
The dev hostname was added to Pi-hole's `dns.hosts` array in
`/opt/docker/pihole/etc-pihole/pihole.toml` (alongside the existing
LAN-only entries) and Pi-hole was restarted to pick up the change.
The legacy `custom.list` file is still present but is no longer the
authoritative source — `dns.hosts` in `pihole.toml` is what
`pihole-FTL` actually consults.
If `dev.arrflix.s8n.ru` ever fails to resolve, restart Pi-hole and
re-check the `dns.hosts` array.
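A quick resolution check from any LAN box (a sketch; assumes Pi-hole
answers on `192.168.0.100`, the host both names pin to):

```bash
dig +short dev.arrflix.s8n.ru @192.168.0.100
# expect: 192.168.0.100; if empty, restart Pi-hole and re-check dns.hosts
```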
@@ -1,372 +0,0 @@
# 13 — Optimization Audit (Read-Only)
> Status: **read-only audit**, executed 2026-05-08 against
> `https://arrflix.s8n.ru` (Jellyfin 10.10.3 on nullstone). Scope: scan
> for performance, capacity, reliability, and ops-hygiene risks. **No
> fixes applied. No state mutated. No container restarts.**
Audit ran ~25 minutes wall. Inputs: Jellyfin REST API (auth
`X-Emby-Token: 76858153…f8b1`), `docker exec jellyfin`, `docker logs
{traefik,jellyfin} --since 1h/6h/24h`, host `free`, `df`, `uptime`,
`nvidia-smi`, on-disk Jellyfin XML configs.
---
## Executive summary
1. **Host is under serious memory pressure right now.** `uptime` shows
load average **11.40 / 9.59 / 6.19** on a 12-core box, **6.8 GiB of
swap is in use** (out of 24 GiB), and `/home` is **90 % full
(40 GiB free of 399 GiB)**. Jellyfin itself is fine
(522 MiB / 31 GiB cap, no restarts), but the host it lives on is
loaded enough that any media ingest at scale will start swap-thrashing.
This is the single biggest risk to playback latency.
2. **GPU transcode is dead and confirmed dead.** `nvidia-smi` fails on
host, `lsmod | grep nvidia` returns empty, `/dev/nvidia*` does not
exist. `EnableHardwareEncoding=true` and `HardwareAccelerationType=none`
in `encoding.xml` is harmless but misleading — the toggle is on, but
the type selector is `none`, so every transcode goes through ffmpeg
software path. Two HLS segment requests this hour returned **499**
(client cancelled mid-transcode at 6.4 s and 2.9 s wall) — that is
the playback-stalls signature.
3. **OpenSubtitles plugin is logging an error per file probed during
library scan** (102 errors in last 6 h) because `Username` and
`Password` are empty in the plugin XML. Every Scan Media Library run
tries Open Subtitles, fails on auth, logs an `ERR`, retries on the
next file. This is pure log noise + wasted RTT, not data loss, but
it bloats `/config/log` and obscures real warnings.
4. **Transcode throttling is OFF and `MaxMuxingQueueSize` is 2048**
on a CPU-only deploy. That means a stalled client with a high-bitrate
AV1/HEVC source will keep ffmpeg burning a full core for up to
`SegmentKeepSeconds` (720 s) after the client gives up. `EnableThrottling`
should be on for a CPU deploy; it would have capped the wasted CPU
behind the 499s seen above.
5. **No automated backup of `/home/docker/jellyfin/config/`.** The
Cineplex CSS, the 5 user accounts + permissions, the library
metadata, and the Open Subtitles plugin install all live in one
unprotected directory tree. The repo's `snapshots/` only captures the
pre-ElegantFin migration baseline; nothing on disk is being rotated
off-host.
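For summary item 5 (finding 06), a minimal sketch of the missing backup
job; destination path and schedule are assumptions, so point it
wherever the NAS actually mounts:

```bash
# Weekly /config backup, zstd-compressed, logs excluded (cache/ is a sibling
# dir and is excluded by construction, since only config/ is archived).
tar --zstd --exclude='config/log' \
    -cf /mnt/nas/backups/jellyfin-config-$(date +%F).tar.zst \
    -C /home/docker/jellyfin config
```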
---
## Findings table
Severity legend: **R** = red (acute, fix this week), **Y** = yellow
(deferred fix, document risk), **G** = green (audited, healthy, no
action). Effort: **S** ≤ 30 min, **M** half-day, **L** > 1 day.
| # | Category | Severity | Evidence | Recommendation | Effort |
|---|---|:-:|---|---|:-:|
| 01 | Host capacity | **R** | `uptime` load 11.40 / 9.59 / 6.19 on 12 cores; swap 6.8 GiB used / 24 GiB; `/home` 90 % full | Identify swap hog (likely not Jellyfin — only 522 MiB RSS); reclaim space on `/home`; budget media additions against the 40 GiB headroom | M |
| 02 | GPU transcode | **R** | `nvidia-smi` fails, no `/dev/nvidia*`, `lsmod` no nvidia mod; `HardwareAccelerationType=none` | Reinstall nvidia driver on nullstone host; once `nvidia-smi` works, add device reservation block to compose and flip `HardwareAccelerationType` to `nvenc` | L |
| 03 | Transcode throttling | **R** | `EnableThrottling=false`, `ThrottleDelaySeconds=180`, `MaxMuxingQueueSize=2048`, **two 499 client-cancels** logged (6 439 ms / 2 890 ms) | Enable `EnableThrottling=true` and `EnableSegmentDeletion=true` for CPU-only era — caps wasted ffmpeg CPU after client disconnect | S |
| 04 | OpenSubtitles auth | **R** | `Username`/`Password` empty in `Jellyfin.Plugin.OpenSubtitles.xml`; **102** `Error downloading subtitles from Open Subtitles` lines / 6 h | Set creds via UI, OR disable the provider on both libraries (`EnableInternetProviders=false` already; subtitle search still runs). Doc 03-subtitles.md already calls this out as pending | S |
| 05 | Cache trash budget | **Y** | `EnableSegmentDeletion=false`, `SegmentKeepSeconds=720`; `/cache/transcodes` only 20 K right now (no live stream), but a 4K HEVC→h264 session will fill GiBs and not auto-prune | Enable `EnableSegmentDeletion=true` (default 720 s keep is fine) — pairs with finding 03 | S |
| 06 | Backup posture | **R** | `/home/docker/jellyfin/config/` (104 MB) has no off-host rotation; `snapshots/` in repo only holds pre-ElegantFin baseline | Add a weekly `tar.zst` of `/config/` (excluding `log/`, `cache/`) to NAS or git-backed snapshot dir | M |
| 07 | Disk pressure | **Y** | `/home` 90 % full, 40 GiB free of 399 GiB; `/home/user/media` only 189 files | Cap on media growth: at current free space + episode bitrate budget user has ~3–4 more series before disk fills | M |
| 08 | DB WAL ratio | **Y** | `library.db`=3.3 MB, `library.db-wal`=4.4 MB (WAL > main, uncheckpointed). `Optimize database` last ran 2026-05-08T00:58 (OK) but a fresh scan completed 03:16 left WAL fat | Either trigger a manual `Optimize database` post-scan, or shorten its schedule to "after every full scan". WAL > main is normal during/after a scan but should checkpoint on idle | S |
| 09 | Custom CSS bloat | **Y** | `CustomCss` in `branding.xml` is **25 225 bytes**, 17 `!important`, sole `@import` is `MRunkehl/cineplex@v1.0.6` (jsDelivr) | jsDelivr import adds 1 round-trip + ~50 KB on every cold cache load. Inline the import for offline-resilience and one-fewer DNS hop. Also doc 11 already flags this as the wrong theme (Cineplex, not NeutralFin) — resolve theme race first | M |
| 10 | SPA shim cost | **G** | `web-overrides/index.html` 58 KB; runs **2× MutationObserver** + **1× setInterval(1000ms)** with `lockTitle/lockFavicon/nukeSettings`; cost ~1 ms per tick | Acceptable for a single-tab branding shim; would be a problem only on background tabs at scale. No action | — |
| 11 | Service worker | **G** | `/web/serviceworker.js` 768 bytes, last modified 2024-11-19 (Jellyfin 10.10.3 ship date), serves with `cache-control: no-store` (HTTPS, etag set). Notification-only SW (per doc 10) | No action — it is small and not caching `index.html` so cannot pin stale branding | — |
| 12 | Metrics endpoint | **G** | `EnableMetrics=false` | Off is correct for a single-server box. No action | — |
| 13 | Slow-response warning | **Y** | `EnableSlowResponseWarning=true`, threshold **500 ms**. Two transcoding 499s above 2.8 s would normally trigger this warning, but I see 0 `slow` lines in 1 h logs | Either Jellyfin's slow log only fires on synchronous request handlers (not HLS segment GETs), or warning suppressed by another setting. Worth confirming threshold semantics | S |
| 14 | Library scan concurrency | **Y** | `LibraryScanFanoutConcurrency=0`, `LibraryMetadataRefreshConcurrency=0`, `ParallelImageEncodingLimit=0` (all defaults — auto = `ProcessorCount`) | On a 12-core box already at load 11+, `0` (= 12) for all three is aggressive. Cap each at 4–6 to leave headroom for Forgejo/Traefik/etc | S |
| 15 | Realtime monitor | **Y** | Both libraries have `EnableRealtimeMonitor=true`; only 189 files; `LibraryMonitorDelay=60` | Fine for current size, but inotify watches grow with file count. Re-evaluate at 10 k+ files | — |
| 16 | Trickplay / chapter previews | **G** | `EnableTrickplayImageExtraction=false`, `ExtractChapterImagesDuringLibraryScan=false`, `EnableChapterImageExtraction=false`, `ExtractTrickplayImagesDuringLibraryScan=false` (all libs) | Disabled on both libraries — saves significant CPU. No action. (Note: scheduled task `Generate Trickplay Images` still ran 02:00 — check it is a no-op when libs say no) | — |
| 17 | Photos library | **G** | `EnablePhotos=false` on both | Correct for a movies/TV deploy. No action | — |
| 18 | Plugin set | **G** | 6 plugins active (AudioDB, MusicBrainz, OMDb, OpenSubtitles, StudioImages, TMDb). `Username/Password` empty for OMDb (= no key, falls back to anon rate limit) and TMDb (`TmdbApiKey` empty — falls back to bundled key) | Both tolerated. AudioDB + MusicBrainz unused (no music libs) but cost zero idle. Consider removing for minimalism, not perf | — |
| 19 | Admin user policy | **R** | `s8n` admin has `EnableRemoteControlOfOtherUsers=true`, `EnableContentDeletion=true` (correct for admin) but **also `IsHidden=true`** | Hidden admin is non-standard; usually a hidden admin is reserved for automation. If `s8n` is the operator's daily account, `IsHidden=false` is the convention. Low risk, just unusual | S |
| 20 | Non-admin policies | **Y** | All 4 non-admin users (`5`, `guest`, `house`, `marco`) have `EnableContentDownloading=true`, `EnableMediaConversion=true`, `EnableLiveTvManagement=true`, `EnableSharedDeviceControl=true`, `IsHidden=true` | LiveTvManagement on accounts with no Live TV is dead weight, no harm. ContentDownloading + MediaConversion let any user kick off transcodes — a foot-gun on a CPU-only host. Review desired stance | S |
| 21 | Login disclaimer leak | **G** | `LoginDisclaimer` = "Welcome to ARRFLIX - Private invite only service" | Public-facing string is intentional per doc 09. No action | — |
| 22 | Public WAN exposure | **Y** | `EnableRemoteAccess=true`, `no-guest@file` middleware **dropped** in compose (per doc 09 §1.2). 24 h log: 270 LAN reqs, **59 reqs from 157.143.84.87, 1 from 82.31.156.86** | Doc 09 confirms this is intentional. The 157.143.84.87 hits are bot-style asset-prober 404s — harmless but confirms the service is internet-reachable. No action; re-verify rate limit / fail2ban once router port-forward is active | — |
| 23 | Splashscreen size | **Y** | `/config/data/splashscreen.png` is **3.0 MB** | A splash image of 3 MB is large for a PNG; lossless re-encode or downscale to ≤500 KB; saves on first-paint over WAN | S |
| 24 | Log rotation | **G** | `LogFileRetentionDays=3`; `/config/log` 1.3 MB; rotation working | No action | — |
| 25 | Splashscreen flag | **Y** | `SplashscreenEnabled=true` in `branding.xml` | Intentional for branding, no action — pairs with finding 23 (just shrink the file) | — |
| 26 | Cache breakdown | **G** | `/cache/images` 15 MB (entire cache 15 MB); `/config/metadata` 92 MB; `/config/data` 12 MB; `/config/plugins` 128 KB | Healthy small footprint. No action | — |
| 27 | Forgejo log noise | **Y** | Traefik logs show `forgejo@docker` returning **401** for `s8n/ARRFLIX.git/info/refs?service=git-receive-pack` 8× / hour from 192.168.0.10 | Out of scope for this deploy but indicates a stale `git push` retry loop on onyx — surfaces here only because we're scanning traefik logs. Mention to operator separately | — |
| 28 | Path substitutions | **G** | `system.xml` empty `<PathSubstitutions />` and `<CorsHosts />` | Correct (no NFS/SMB indirection, no cross-origin clients). No action | — |
| 29 | LiveTV residue | **G** | `DisableLiveTvChannelUserDataName=true`; no Live TV configured; per-user `EnableLiveTvAccess=true` is dead weight | Cosmetic; no perf cost. No action | — |
| 30 | Container restart count | **G** | `docker inspect` `RestartCount=0`, `Status=running`, `StartedAt=2026-05-08T02:13:01` (~2 h uptime, healthy) | No action. (Boot was at 02:13, suggests the compose was applied for doc-09 WAN flip and ran clean since) | — |
| 31 | Network XML hygiene | **Y** | `KnownProxies` empty, `LocalNetworkSubnets` empty, `LocalNetworkAddresses` empty | Jellyfin can't tell the Traefik 172.20.0.0/16 docker net from random WAN — every external IP is logged as remote, which inflates Jellyfin's geoIP/session bookkeeping. Set `KnownProxies=172.20.0.0/16` and `LocalNetworkSubnets=192.168.0.0/24` | S |
| 32 | TLS cert | **G** | LE cert valid `2026-05-08 → 2026-08-06` (89 days remaining), issued by R13, Gandi DNS-01 resolver, in `acme.json` | Healthy. No action | — |
| 33 | Request-rate posture | **G** | 81 req / hour total via traefik; 62 of those are `jellyfin@docker`. Top src 192.168.0.10 (LAN, the operator), then 157.143.84.87 (asset-prober 404s) | Low rate. No action — re-evaluate if WAN exposure draws more traffic | — |
| 34 | Idle session count | **G** | `/Sessions` returns 2 idle (s8n + guest) on 192.168.0.10; no playback in flight at audit time | No action | — |
| 35 | Item counts | **G** | 2 movies, 6 series, 169 episodes; matches `find /media -type f` (189 files, accounting for non-video extras) | Library scan is healthy; counts converged | — |
---
## Recommended fix order (top 5 by impact-per-effort)
1. **Finding 03 — enable transcode throttling + segment deletion.**
*Effort: S (two checkboxes in Playback settings).* Closes the
highest-cost behaviour we have evidence of (the status-499 client
cancels at 6.4 s and 2.9 s wall). Saves CPU cycles per stalled
client. A config-file sketch follows at the end of this section.
2. **Finding 04 — set OpenSubtitles credentials, OR disable
provider.** *Effort: S.* Removes 102 ERR/6 h of log spam, fixes
subtitle download, immediately restores log signal.
3. **Finding 31 — populate `KnownProxies` + `LocalNetworkSubnets` in
`network.xml`.** *Effort: S.* Restores accurate session origin
reporting; needed before any rate-limiting or fail2ban work post-WAN.
4. **Finding 14 — cap `LibraryScanFanoutConcurrency`,
`LibraryMetadataRefreshConcurrency`, `ParallelImageEncodingLimit`
to 4–6.** *Effort: S.* Stops a future scan piling on top of the
existing host load (currently 11.4).
5. **Finding 06 — automate `/config/` backup.** *Effort: M.* Single
highest-blast-radius risk: a corrupt `library.db` or a `branding.xml`
regression and you've lost the user accounts AND the theme work in
one go. A weekly `tar.zst` to NAS closes this.
GPU re-enable (finding 02) would unlock more wins but is **L** effort
and lives outside Jellyfin (host driver work). Throttling (#03) is the
right CPU-era patch until then.
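A sketch of fix 1 applied straight to the config file (assumes both
elements appear exactly once in `encoding.xml` and that the UID-1000
owner can edit it; the Dashboard → Playback checkboxes are the safer
route):

```bash
ssh user@192.168.0.100 '
  sed -i \
    -e "s#<EnableThrottling>false#<EnableThrottling>true#" \
    -e "s#<EnableSegmentDeletion>false#<EnableSegmentDeletion>true#" \
    /home/docker/jellyfin/config/config/encoding.xml \
  && cd /opt/docker/jellyfin && docker compose restart jellyfin
'
```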
---
## Out of scope (audited and found healthy)
- **Service worker** (`/web/serviceworker.js`, 768 B, notification-only,
not caching index.html — finding 11).
- **Container restart count** (0 — finding 30).
- **TLS cert chain** (89 days valid — finding 32).
- **Trickplay / chapter / photo extraction** (all disabled — findings
16, 17).
- **Log rotation** (3-day retention working, 1.3 MB /config/log —
finding 24).
- **Cache directory growth** (15 MB total, healthy — finding 26).
- **Plugin set** (6 plugins, all idle-cheap — finding 18).
- **Idle session footprint** (2 idle web sessions, no playback in
flight — finding 34).
- **Item count convergence** (Items/Counts matches filesystem —
finding 35).
- **Path substitution / CORS hygiene** (empty as expected — finding 28).
- **Login disclaimer string** (per-doc-09 intentional public-facing
text — finding 21).
---
## Appendix — raw evidence
### Host
```
uptime: 04:18:55 up 4 days, 4:36, 3 users, load average: 11.40, 9.59, 6.19
nproc: 12
free -h: total 31Gi, used 9.2Gi, free 5.8Gi, swap used 6.8Gi / 24Gi
df -h /home: 399G total, 339G used, 40G avail (90 % full)
```
### Container
```
docker stats jellyfin (no-stream):
CPU 0.01 %, MEM 521.5 MiB / 31.27 GiB (1.63 %), PIDS 24, NET 83.8 MB / 361 MB
docker inspect: Restarts=0, Started=2026-05-08T02:13:01Z, Status=running
```
### GPU
```
nvidia-smi: NVIDIA-SMI has failed because it couldn't communicate with the NVIDIA driver
lsmod | grep nvidia: (no matches)
ls /dev/nvidia*: No such file or directory
encoding.xml: HardwareAccelerationType=none, EnableHardwareEncoding=true
```
### Disk
```
/config 104 M (data 12M, metadata 92M, log 1.3M, plugins 128K)
/cache 15 M (images 15M, transcodes 20K, fontconfig 36K, omdb 84K)
/home/docker/jellyfin: not visible (sudo blocked); inferred from container view
```
### Database
```
jellyfin.db 208 K (WAL 473 K, SHM 32 K)
library.db 3.3 M (WAL 4.4 M, SHM 32 K) <- WAL > main
keyframes/ 16 K
splashscreen.png 3.0 M
```
### Traefik (last 1 h)
```
total log lines: 279
jellyfin@docker requests: 62
status 499 (client cancel): 2 (HLS segments, 6439 ms + 2890 ms)
status 5xx: 0
top source IPs (jellyfin):
82.31.156.86 123 (own WAN egress, hairpin)
82.131.116.123 122 (external — likely friend / scanner)
192.168.0.10 13 (operator LAN)
173.244.58.11 2 (cloud scanner)
35.203.85.72 1 (Google security scan)
```
### Jellyfin (last 6 h)
```
"Error downloading subtitles from Open Subtitles": 102
"slow" / "throttl" matches: 1 (false positive, no real slow-warn)
Container restart events: 0
```
### TLS
```
Subject: CN=arrflix.s8n.ru
Issuer: C=US, O=Let's Encrypt, CN=R13
Valid: 2026-05-08 00:58:11 GMT → 2026-08-06 00:58:10 GMT (89 d)
Resolver: letsencrypt (Gandi DNS-01)
```
### Service worker
```
URL: https://arrflix.s8n.ru/web/serviceworker.js
HTTP: 200, content-type text/javascript
Size: 768 bytes
Last-Modified: Tue, 19 Nov 2024 03:43:48 GMT (Jellyfin 10.10.3 ship)
Headers: HSTS preload + nosniff + frame=SAMEORIGIN + xss-protection
```
### CSS / branding
```
/Branding/Configuration:
CustomCss bytes: 25 225
!important rules: 17
sole @import: https://cdn.jsdelivr.net/gh/MRunkehl/cineplex@v1.0.6/cineplex.css
LoginDisclaimer: "Welcome to ARRFLIX - Private invite only service"
SplashscreenEnabled: True
on disk:
/config/config/branding.xml 25 584 bytes
```
### SPA shim
```
/opt/docker/jellyfin/web-overrides/index.html 58 725 bytes
MutationObserver count: 2 (one head/title-favicon, one body/nukeSettings)
setInterval count: 1 (1000 ms — relocks title + favicon + nukeSettings)
```
### Users
```
# users: 5
admin (s8n): IsHidden=true, EnableRemoteControlOfOtherUsers=true, EnableContentDeletion=true
non-admin (5, guest, house, marco): IsHidden=true, EnableContentDownloading=true,
EnableMediaConversion=true, EnableLiveTvManagement=true
```
### Plugins
```
AudioDB 10.10.3.0 Active
MusicBrainz 10.10.3.0 Active RateLimit=1, ReplaceArtistName=false
OMDb 10.10.3.0 Active CastAndCrew=false
Open Subtitles 20.0.0.0 Active Username/Password empty, CredentialsInvalid=false
Studio Images 10.10.3.0 Active
TMDb 10.10.3.0 Active TmdbApiKey empty
```
### Library options (both libs)
```
EnableRealtimeMonitor = True
ExtractChapterImagesDuringLibraryScan = False
EnableTrickplayImageExtraction = False
EnablePhotos = False
SaveLocalMetadata = False
EnableInternetProviders = False
SkipSubtitlesIfAudioTrackMatches = True
SaveSubtitlesWithMedia = True
ExtractTrickplayImagesDuringLibraryScan = False
```
### Network XML
```
EnableHttps=false (TLS handled by Traefik) | EnableUPnP=false | EnableRemoteAccess=true
KnownProxies=(empty) LocalNetworkSubnets=(empty) LocalNetworkAddresses=(empty)
IgnoreVirtualInterfaces=true VirtualInterfaceNames=[veth]
EnablePublishedServerUriByRequest=false
```
### System config — performance knobs
```
LogFileRetentionDays = 3
EnableMetrics = False
EnableSlowResponseWarning = True (threshold 500 ms)
RemoteClientBitrateLimit = 0 (no cap)
LibraryScanFanoutConcurrency = 0 (auto = ProcessorCount = 12)
LibraryMetadataRefreshConcurrency = 0 (auto = ProcessorCount = 12)
ParallelImageEncodingLimit = 0 (auto = ProcessorCount = 12)
EnableNormalizedItemByNameIds = True (correct for 10.10.x)
QuickConnectAvailable = False
EnableCaseSensitiveItemIds = True
EnableFolderView = False
EnableGroupingIntoCollections = False
IsStartupWizardCompleted = True
ChapterImageResolution = (default)
DummyChapterDuration = (default)
ImageExtractionTimeoutMs = (default)
LibraryMonitorDelay = 60
LibraryUpdateDuration = 30
ActivityLogRetentionDays = (default)
```
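If finding 14 is actioned later, the caps can land via the same read-modify-write pattern doc 15 uses, since `/System/Configuration` replaces the whole object on POST. A sketch, not applied; the cap value is a hedged pick from the 4-6 range:
```bash
# Forward plan for finding 14 (NOT applied): cap scan/refresh concurrency.
TOKEN=<JELLYFIN_API_TOKEN>
BASE=https://arrflix.s8n.ru

curl -s "$BASE/System/Configuration" \
  -H "Authorization: MediaBrowser Token=$TOKEN" \
  | python3 -c '
import json, sys
c = json.load(sys.stdin)
for k in ("LibraryScanFanoutConcurrency",
          "LibraryMetadataRefreshConcurrency",
          "ParallelImageEncodingLimit"):
    c[k] = 4  # 0 means auto = all 12 cores; 4 leaves headroom under load
json.dump(c, sys.stdout)' > /tmp/sysconfig.json

curl -s -X POST "$BASE/System/Configuration" \
  -H "Authorization: MediaBrowser Token=$TOKEN" \
  -H "Content-Type: application/json" \
  --data-binary @/tmp/sysconfig.json -w "%{http_code}\n" -o /dev/null
# Expect: 204
```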
### Encoding config — full dump
```
EncodingThreadCount = -1 (auto)
EnableAudioVbr = False
MaxMuxingQueueSize = 2048
EnableThrottling = False ← finding 03
ThrottleDelaySeconds = 180
EnableSegmentDeletion = False ← finding 05
SegmentKeepSeconds = 720
HardwareAccelerationType = none ← finding 02
EncoderAppPathDisplay = /usr/lib/jellyfin-ffmpeg/ffmpeg
VaapiDevice = /dev/dri/renderD128 (no Intel iGPU on host)
H264Crf = 23
H265Crf = 28
EncoderPreset = (nil)
EnableHardwareEncoding = True (no-op while type=none)
AllowHevcEncoding = False
AllowAv1Encoding = False
EnableSubtitleExtraction = True
HardwareDecodingCodecs = [h264, vc1]
AllowOnDemandMetadataBasedKeyframeExtractionForExtensions = [mkv]
PreferSystemNativeHwDecoder = True
EnableEnhancedNvdecDecoder = True (no-op while no nvidia)
```
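Findings 03 and 05, when actioned, take the same shape against the named `encoding` configuration (assumed here to be the `/System/Configuration/encoding` named-configuration route). Sketch only, not applied:
```bash
# Forward plan for findings 03 + 05 (NOT applied): throttle + segment deletion.
TOKEN=<JELLYFIN_API_TOKEN>
BASE=https://arrflix.s8n.ru

curl -s "$BASE/System/Configuration/encoding" \
  -H "Authorization: MediaBrowser Token=$TOKEN" \
  | python3 -c '
import json, sys
c = json.load(sys.stdin)
c["EnableThrottling"] = True       # pause ffmpeg once the client buffer is ahead
c["EnableSegmentDeletion"] = True  # delete already-served HLS segments
json.dump(c, sys.stdout)' > /tmp/encoding.json

curl -s -X POST "$BASE/System/Configuration/encoding" \
  -H "Authorization: MediaBrowser Token=$TOKEN" \
  -H "Content-Type: application/json" \
  --data-binary @/tmp/encoding.json -w "%{http_code}\n" -o /dev/null
# Expect: 204
```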
### Scheduled tasks
```
Audio Normalization Idle Completed 2026-05-08T00:58
Clean Cache Directory Idle Completed 2026-05-08T00:58
Clean Log Directory Idle Completed 2026-05-08T00:58
Clean Transcode Directory Idle Completed 2026-05-08T02:13
Download missing subtitles Idle Completed 2026-05-08T00:58
Extract Chapter Images Idle Completed 2026-05-08T01:00
Generate Trickplay Images Idle Completed 2026-05-08T02:00 (no-op?)
Optimize database Idle Completed 2026-05-08T00:58
Refresh People Idle Completed 2026-05-08T00:58
Scan Media Library Idle Completed 2026-05-08T03:16
Update Plugins Idle Completed 2026-05-08T02:13
```
---
## Sign-off
- Audit: 2026-05-08, read-only, ~25 min wall.
- No fixes applied. No state mutated. No container restart.
- Next audit due: **2026-08-08** (quarterly, before LE cert renewal
window opens at 2026-08-06).

@@ -1,617 +0,0 @@
# 14 — Theme Audit + Detail-Page Backdrop Diagnosis
Status: **read-only audit**, executed 2026-05-08 against
`https://arrflix.s8n.ru` (Jellyfin 10.10.3 on nullstone). The owner has
just rolled back to **Cineplex v1.0.6** (the Netflix-faithful theme)
after a brief ElegantFin → NeutralFin experiment that was documented in
docs 04 §3e and 11 respectively. Reported issue: on detail pages the
**backdrop image leaves a visible vertical black band on the left** where
the title/info column sits. Owner asked for a forward plan, not a fix.
> **No state mutated.** No POST to `/System/Configuration/branding`,
> no edit to `/jellyfin/jellyfin-web/index.html`, no docker action.
> Read-only over SSH and against the public `/Branding/Configuration`
> + authenticated `/System/Configuration/branding` endpoints.
---
## 1. Current state inventory
### 1a. Active theme
`/System/Configuration/branding` returns:
| Field | Value |
|---|---|
| `LoginDisclaimer` | `"Welcome to ARRFLIX - Private invite only service"` |
| `SplashscreenEnabled` | `true` |
| `CustomCss` (size) | **25 225 chars** (most of which is the embedded ARRFLIX wordmark data-URL — twice) |
Sole `@import` line:
```css
@import url("https://cdn.jsdelivr.net/gh/MRunkehl/cineplex@v1.0.6/cineplex.css");
```
Cineplex itself transitively imports
`cineplex@v1.0.5/finity-theme/finity-complete.css` (its parent theme,
**Finity** by prism2001). This matters for the backdrop diagnosis below.
### 1b. CustomCss block inventory (every rule, in order)
`!important` declarations: **17**. `#E50914` occurrences: **0** in
CustomCss; **1** in `web-overrides/index.html` critical-path `<style>`.
ARRFLIX wordmark PNG: **235 × 85 px** (aspect 2.765 : 1), embedded
as base64 data-URL on two selectors.
| # | Block | Selectors | Purpose | `!important` count |
|---|---|---|---|---|
| 1 | Cineplex import | `@import` | Theme entry point | 0 |
| 2 | Cast/Crew hide | `#castCollapsible, #guestCastCollapsible` | Drop reviewer cruft | 1 |
| 3a | ARRFLIX logo (img) | `.adminDrawerLogo img` | `content:` replace src in admin drawer | 1 |
| 3b | ARRFLIX logo (div) | `.pageTitleWithLogo` | `background-image:` for masthead `<div>` | 1 |
| 4 | Quick Connect hide | `.btnQuick` | Belt-and-braces for the server-side disable in 04 §4g | 1 |
| 5 | Header icon hide | `.headerSyncButton`, `.headerCastButton`, `.headerUserButton` | Keep only Search top-right | 3 |
| 6a | Slider thumbs (white) | `.MuiSlider-thumb`, `.osdPositionSlider .MuiSlider-thumb`, `.osdVolumeSlider .MuiSlider-thumb`, `emby-slider .sliderThumb` | OSD scrubber + volume circles | 3 |
| 6b | Slider thumbs (focus halo) | `.MuiSlider-thumb:hover/:active/.Mui-focusVisible` | Hover ring | 1 |
| 7a | Pure-black bg (vars) | `:root { --primary-background-color/--background-color: #000 }` | Force shell vars to true black | 2 |
| 7b | Pure-black bg (wrappers) | `html, body, .preload, .skinBody, .mainDrawerHandle` | Anti-flash on shell wrappers | 1 |
| 7c | Pure-black bg (containers) | `.skinHeader, .skinHeader.semiTransparent, .skinHeader.skinHeader-withBackground, .mainAnimatedPages, #reactRoot, .dashboardDocument` | Container surfaces | 1 |
| 8 | Settings drawer hide | `a[href*="mypreferencesmenu"]`, `[to="/mypreferencesmenu.html"]` and `:has()` parent variants × 7 | Remove Settings link from drawer | 1 |
| 9 | Count-badge hide | `.countIndicator` | Drop unwatched-episode badges | 1 |
### 1c. Critical-path inline `<style>` (in `web-overrides/index.html`)
Bind-mounted at `/jellyfin/jellyfin-web/index.html`, paints **before** the
SPA bundle loads CustomCss:
| Block | Effect |
|---|---|
| `:root { --primary-background-color: #000; --background-color: #000 }` | Pre-paint shell vars (no `!important`) |
| `html, body, .preload, .skinBody, .skinHeader, #reactRoot, .mainAnimatedPages { bg:#000 !important; color:#fff !important }` | Anti-flash + force colour |
| `.raised, .button-submit, .emby-button[type=submit], button[type=submit] { bg:#E50914 !important; color:#fff !important }` | Pre-paint Netflix-red on submits (login Sign-In) |
| `.splashLogo { animation: fadein .5s; width:30%; height:30%; bg-image:<ARRFLIX wordmark data-URL>; bg-size:contain; bg-position:center; position:fixed; top:50%; left:50%; transform:translate(-50%,-50%) }` | The pre-bundle splash screen |
| `@media (min-device-width:992px) { .splashLogo { bg-image:<same ARRFLIX wordmark, full-res copy> } }` | Desktop variant (currently identical bytes — see §6) |
Plus 78 lines of inline `<script>` (ARRFLIX-SHIM) that locks
`document.title`, the favicon, and continuously hides any
`mypreferencesmenu` drawer entry that might be rendered after navigation.
None of the JS touches detail-page layout.
---
## 2. Detail-page backdrop diagnosis
### 2a. Selector hunt against the live JF 10.10.3 web bundle
`docker exec jellyfin grep -oE` against
`/jellyfin/jellyfin-web/main.jellyfin.1ed46a7a22b550acaef3.css` and
`itemDetails-index-html.ca5f15ff794311af00a6.chunk.js` returned the
canonical detail-page selector set:
| Selector | Where defined | Stock JF 10.10.3 layout |
|---|---|---|
| `.itemBackdrop` | `main.jellyfin.<hash>.css` | `height: 40vh; width: <inherited>; background-size: cover; background-attachment: fixed; position: relative;` (i.e. the backdrop covers **only the top 40vh of the page**) |
| `.layout-mobile .itemBackdrop` | same | `background-attachment: scroll; background-position: top` |
| `.layout-tv .itemBackdrop` | same | `display: none` |
| `.detailPageContent` | same | `display: flex; flex-direction: column; padding-left: 32.45vw` (LTR desktop) — i.e. the content column starts 32.45% from the left |
| `.detailPagePrimaryContainer` | same | `display: flex; align-items: center; z-index: 2;` desktop adds `padding-left: 32.45vw` |
| `.detailImageContainer .card` | same | `position: absolute; top: -80%; left: 3.3%; width: 25vw` (desktop) — the poster card sits in the LEFT column |
| `.detailLogo` | same | `position: absolute; top: 10vh; right: 25vw; width: 25vw; height: 16vh; background-size: contain` |
| `.detailRibbon` | same | desktop: `height: 7.2em; margin-top: -7.2em` (the gradient fade strip below backdrop) |
| `.itemBackdropProgressBar` | same | `position: absolute; bottom:0; left:0; right:0` |
| `.detailPageWrapperContainer` | same | `border-collapse: collapse` |
There is **no** `itemBackdropFader`, no `itemHeroSection`, no
`backdropHeroSection` selector in the bundle. The owner's mental model of
"a fader covering the left column" doesn't match — the architecture is
*positional offsets*, not an overlay.
### 2b. What Cineplex/Finity overrides
`grep -nE "itemBackdrop|detailPagePrimary|detailPageContent|detailLogo|detailImageContainer|detailRibbon|detailPageWrapper" /tmp/cineplex.css /tmp/finity.css` shows:
**`cineplex.css`** — only **two** detail-page rules, both of them
mobile-only. No desktop override of `.itemBackdrop`.
```css
/* line 577 */
.layout-mobile .itemBackdrop {
margin-top: 0rem;
mask-image: linear-gradient(to top, #fff0 1%, #000 15%, #000 80%, #fff0 100%);
}
```
**`finity-complete.css`** — Finity is where the detail-page layout is
heavily redesigned. Key block:
```css
/* finity.css :root */
--detail-page-side-padding: 5%;
--detail-page-primary-width: 45%;
--detail-page-backdrop-offset: 17%; /* <-- THE BLACK BAND */
--detail-page-backdrop-width: 85vw;
--detail-page-mask-offset: 16%;
--detail-page-mask-width: 85vw;
--detail-page-content-offset: -65vh;
.layout-desktop .itemBackdrop {
background-attachment: scroll;
background-position: center;
background-size: cover;
height: 100vh; /* full viewport, NOT 40vh — Finity expands JF default */
width: 100%;
}
.backdropContainer {
height: 100vh;
left: var(--detail-page-backdrop-offset); /* 17% */
position: absolute;
top: 0;
width: var(--detail-page-backdrop-width); /* 85vw */
z-index: 0;
pointer-events: none;
}
.layout-desktop .backgroundContainer.withBackdrop {
background: url("https://raw.githubusercontent.com/prism2001/finity/main/assets/mask.png");
background-size: cover;
height: 100vh;
left: var(--detail-page-mask-offset); /* 16% */
width: var(--detail-page-mask-width); /* 85vw */
z-index: 1;
pointer-events: none;
}
.layout-desktop .detailImageContainer .card { display: none; } /* hide poster card */
```
### 2c. Root cause
The "black band on the left" is **Finity's intentional design**, not a
Cineplex bug and not a JF stock layout artefact:
- Stock Jellyfin: `.itemBackdrop` is `height: 40vh` and full-width
(`width` is inherited from the parent flow). The backdrop crops the
*top* of the page, the info column lays out below it. No left band.
- Finity: re-engineers the page so `.itemBackdrop` is `100vh` *but*
positions a separate `.backdropContainer` absolutely at `left: 17%;
width: 85vw` (so the right ~83% of the viewport gets the backdrop and
the left **17vw / 17%** is left clear). On top of that, a blurred
`mask.png` is overlaid at `left: 16%` to fade the right edge of the
remaining clear band into the backdrop — making the band look like a
designed gradient sidebar, NOT a black bar.
The reason it currently reads as **a hard black band** rather than a
soft gradient fade is the combination of two of our personal tweaks
plus one Finity asset that may not be reaching the browser:
1. **`html, body, .preload, .skinBody, .mainDrawerHandle { bg:#000 !important }`**
forces the underlying surface where the band sits to pure black.
Finity's `--theme-background-color: #181818` is the intended
surface — slightly less harsh.
2. **`#reactRoot, .mainAnimatedPages, .dashboardDocument { bg:#000 !important }`**
does the same for the SPA wrappers above the body.
3. The Finity mask overlay
(`.backgroundContainer.withBackdrop`) loads its mask PNG from
`raw.githubusercontent.com/prism2001/finity/main/assets/mask.png`.
That URL should resolve on a LAN with no upstream proxy, but if the
browser blocks third-party image loads (some ad-blockers strip
`raw.githubusercontent.com` requests) the mask never paints and the
17vw band is unmasked. Worth a DevTools network-tab check before any
CSS change.
Net: the backdrop **is** filling the right 85vw of the viewport. The
left 17vw is intentionally clear so the title/poster/info column has a
high-contrast surface to render on. Our `bg:#000 !important` rules turn
that intentionally-clear surface into a hard black band; without them
it would be `#181818` with a soft gradient fade from the mask PNG.
### 2d. Forward-plan CSS (DO NOT APPLY)
If the goal is **Netflix-style full-bleed backdrop with a left-side
gradient overlay** (info column floating over a darkened-but-visible
backdrop), the proposed rule set is:
```css
/* Detail-page backdrop: full-bleed + left gradient overlay
(proposal — not applied) */
/* 1. Stretch the backdrop container across the full viewport
instead of starting at 17vw */
.layout-desktop .backdropContainer {
left: 0 !important;
width: 100vw !important;
}
/* 2. Replace Finity's mask.png with a CSS-only linear gradient
that darkens the left 40-50vw and fades to transparent.
`.backgroundContainer.withBackdrop` is the overlay layer. */
.layout-desktop .backgroundContainer.withBackdrop {
background: linear-gradient(
90deg,
rgba(0, 0, 0, 0.95) 0%,
rgba(0, 0, 0, 0.85) 25%,
rgba(0, 0, 0, 0.55) 45%,
rgba(0, 0, 0, 0.20) 65%,
rgba(0, 0, 0, 0.00) 85%
) !important;
left: 0 !important;
width: 100vw !important;
}
/* 3. Drop the global black-bg force from the wrappers ON DETAIL
PAGES ONLY so the gradient composes against the actual
backdrop, not pure black. Scope by .itemDetailPage body class
that JF adds on detail routes. */
body.itemDetailPage,
body.itemDetailPage #reactRoot,
body.itemDetailPage .mainAnimatedPages {
background-color: transparent !important;
}
```
The `90deg, 0.95 → 0.00` gradient is the Netflix.com detail-page recipe:
near-opaque on the left where the title sits, fading to fully
transparent by the 85% stop so the right side of the backdrop is
visible at full brightness. Tune the stop percentages once live — the
sweet spot depends on `--detail-page-primary-width` (Finity ships `45%`).
**Untested side-effect to watch for:** Finity *also* hides the poster
card with `.layout-desktop .detailImageContainer .card { display:none }`.
That means we have NO poster in the left column today — the current
black band is empty space framing a clear logo + title block. The fix
above would put the title text directly over the backdrop, which is
fine on most artwork but may have legibility issues on bright/busy
backdrops. If owner wants the poster back, drop that Finity rule too.
### 2e. Screenshot reference
A capture of `https://arrflix.s8n.ru/web/#/details?id=324f75b84f394a5d9b0749c0679f23b9`
(Rick & Morty S01E01 "Pilot") with a hard browser reload would show:
- Top: ~17vw black/empty band on the left, Rick & Morty backdrop on
the right ~83vw. (Finity / current.)
- Title "Pilot" + Series logo + Play button float over the empty band.
- After fix: title floats over a darkened-but-visible portion of the
same backdrop, gradient eases into the un-darkened backdrop on the
right ~30%.
Owner has not provided a current screenshot in this audit; capture
recommended before any CSS change so before/after is documented.
---
## 3. Theme survey 2026-05
Surveyed candidates (live as of audit date), scored on Netflix
fidelity, monochrome fidelity, recency, JF 10.10.3 compatibility,
import format, license:
| Theme | Last commit | License | Netflix fidelity | Monochrome fidelity | JF 10.10.3 compat | Import | Notes |
|---|---|---|---|---|---|---|---|
| **Cineplex v1.0.6** (current) | 2025-09-06 | MIT | **9/10** — true `#E50914`, Netflix Sans webfont, scale-hover, login backdrop | 2/10 | YES (verified live) | single `@import` (transitively pulls Finity) | Bus-factor 1 (single author MRunkehl, 0 stars). Inherits Finity's left-band detail-page layout. |
| **ElegantFin v25.12.31** | 2026-04-30 | GPL-2.0 | 5/10 — Jellyseerr blue/violet by default, recolour-able to `#E50914` (eight `--var` overrides documented in 04 §3e) | 5/10 | YES (tested 10.11.5) | single `@import` | Most actively maintained CSS theme in the ecosystem. Detail-page backdrop is full-width with a gradient overlay built in — no left band. |
| **NeutralFin v1.3.0** | 2025-11-24 | GPL-2.0 | 1/10 (mid-grey accents, no red) | **9/10**`#131313 → #1e1e1e` gradient, mid-grey accents, off-white text | YES (tested implicitly via ElegantFin parent) | single `@import` | Fork of ElegantFin. The "didn't look as good" feel was caused by our `bg:#000 !important` rules clamping its `#131313→#1e1e1e` gradient flat (see doc 11). With those dropped it would render correctly. |
| **Theme Park (jellyfin pack)** | active | GPL-3.0 | n/a — **no Netflix preset** (only aquamarine/hotline/dracula/dark/organizr/space-gray/plex/nord) | varies by preset | likely | single `@import url(theme-park.dev/css/base/jellyfin/<NAME>.css)` | DQ for our brief; closest is `plex` (orange/black) but that's a different brand entirely. |
| **JellyFlix** (prayag17) | 2023-12-20 | none | 9/10 — origin of the genre | 1/10 | **HALTED** (README header) | single `@import` | DQ — explicitly halted, broken on JF 10.11, risky on 10.10.3 |
| **DarkFlix v5.1** | 2024-06 | GPL-3.0 | 8/10 | 1/10 | only declares 10.8.x; **requires 67% browser zoom** | single `@import` | DQ — accessibility issue, no 10.10 statement |
| **Ultrachromic** (CTalvio) | "selectively maintained" — 146 commits, no recent date | MIT | 6/10 (accent-tunable) — three presets: Monochromic, Kaleidochromic, Novachromic | 8/10 (Monochromic preset) | unspecified | single `@import` per preset | "Old, passively maintained." No Netflix preset, but Novachromic accepts custom accents — could be set to `#E50914`. |
| **Finity** (prism2001, Cineplex's parent) | 2026-05 (active) | none stated | 6/10 (dark, modern, no Netflix red by default) | 5/10 | unspecified | single `@import` | Fully responsible for the detail-page layout we see on Cineplex. If the backdrop fix lands, we'd be fixing Finity's `.backdropContainer` rules. |
| **abyss-jellyfin** (AumGupta) | 2026-05 | n/a | 1/10 | 7/10 | unspecified | unknown | "Minimal dark." 290 stars, growing. Not Netflix-flavoured. |
| **FossFlix** (PaleCache) | 2026-01 | n/a | 6/10 (claims Netflix UI similarity) | 1/10 | unspecified | unknown | 1 star, unproven. Worth bookmark, not migration. |
| **JellyFin** (n00bcodr) | 2026-05 | n/a | 0/10 | 6/10 | unspecified | unknown | Inspired by Flow + Zesty — neither fits the brief. |
| **JellyThemes** (kingchenc) | 2026-01 | n/a | 0/10 | varies (six dark themes with glassmorphism) | unspecified | unknown | DQ for Netflix brief. |
| **Hybrid: Cineplex + NeutralFin tweaks** | n/a | derivative | 7/10 | 4/10 | YES if grafted carefully | one `@import` + tweaks | Not actually possible to graft cleanly — Cineplex's red and NeutralFin's grey both define `--theme-accent-color` / `--uiAccentColor` at `:root`, last-write-wins. Picking the import = picking the palette. Ranges of personal-tweak overrides (e.g. `.MuiSlider-thumb:white`) DO survive across both. |
### 3a. Verdict on Theme Park
`docs.theme-park.dev/themes/jellyfin/` lists eight presets: Aquamarine,
Hotline, Dracula, Dark, Organizr, Space-gray, Plex, Nord. **No Netflix
preset.** The closest cousin (`hotline`) is a magenta/cyan synthwave
look, not Netflix-red. Theme Park is therefore not a viable migration
target for the ARRFLIX brand; ruled out.
---
## 4. Personal-tweak portability matrix
For each personal-tweak block in current `CustomCss`, classify the
selector as **theme-independent** (generic Jellyfin selector, survives
any swap) vs **theme-specific** (requires re-targeting).
| # | Block | Selector | Type | Cineplex | ElegantFin | NeutralFin | Theme-Park | Portability |
|---|---|---|---|---|---|---|---|---|
| 2 | Cast/Crew hide | `#castCollapsible, #guestCastCollapsible` | Generic JF id | works | works | works | works | **HIGH** |
| 3a | Logo (admin) | `.adminDrawerLogo img` | Generic JF class | works | works (per 04 §3e — verified 0 ElegantFin matches) | works (no NeutralFin matches) | works | **HIGH** |
| 3b | Logo (masthead) | `.pageTitleWithLogo` | Generic JF class | works (with `bg-image`, NOT `content:`) | works (verified) | works | works | **HIGH** |
| 4 | Quick Connect hide | `.btnQuick` | Generic JF class on `<button>` | works | works | works | works | **HIGH** |
| 5 | Header icons hide | `.headerSyncButton`, `.headerCastButton`, `.headerUserButton` | Generic JF classes (verified in `73233.*.chunk.js`) | works | works | works (NeutralFin sets `width/height/border` on `.headerUserButton` but `display:none` overrides those) | works | **HIGH** |
| 6 | Slider thumb white | `.MuiSlider-thumb` + variants | MUI runtime class | works | works | works (theme doesn't theme MUI sliders) | works | **HIGH** — but consider re-tinting on monochrome themes |
| 7a | Bg vars `:root` | `--primary-background-color`, `--background-color` | Jellyfin shell var | works (Cineplex defaults to `#181818` — we override to `#000`) | works | **HARMFUL on NeutralFin** — clamps the `#131313→#1e1e1e` gradient (see doc 11 row 8) | works | **MEDIUM** — survives technically, but defeats NeutralFin's intent. |
| 7b/7c | Bg wrappers (`html`, `body`, `.skinHeader`, `.mainAnimatedPages`, `#reactRoot`, `.dashboardDocument`) | Jellyfin shell wrappers | works (Cineplex doesn't theme these) | works (ElegantFin uses translucent wrappers — `#000` underneath is fine) | **HARMFUL** — clamps gradient + flattens `.skinHeader.semiTransparent` (see doc 11 row 10) | likely works | **MEDIUM** — and **harmful on detail pages for Cineplex** (this is what's making the 17vw band hard-black, see §2c above) |
| 8 | Settings drawer hide | `a[href*="mypreferencesmenu"]`, `[to="/mypreferencesmenu.html"]`, `:has()` parents | JF route + MUI ListItem classes | works | works | works | works | **HIGH** (if browser supports `:has()`) |
| 9 | Count badge hide | `.countIndicator` | Generic JF class | works | works | works (NeutralFin themes it, but `display:none` wins) | works | **HIGH** |
| index.html | Anti-flash inline | `html, body, .preload, .skinBody, .skinHeader, #reactRoot, .mainAnimatedPages` | Same wrappers as 7b/7c, but **pre-bundle** | works | works | **HARMFUL** — same issue as 7b/7c, but earlier in load (see doc 11 row 14) | likely | **LOW-MEDIUM** — needs `!important` removed and `.skinHeader` dropped from the list to be theme-portable |
| index.html | Submit-button red | `.raised, .button-submit, .emby-button[type=submit], button[type=submit]` | Generic JF + MUI button classes | works (matches Cineplex's `#E50914` accent) | requires recolour-aware ElegantFin (works since override is in our hands) | **HARMFUL** — paints every submit Netflix-red over a monochrome theme (see doc 11 row 15) | works | **LOW** — rule is brand-specific, must be removed when brand colour changes (NeutralFin would need `--btnSubmitColor` instead) |
| index.html | ARRFLIX shim (title/favicon/`mypreferencesmenu`) | inline `<script>` | Independent of theme | works | works | works | works | **HIGH** |
| index.html | Splash logo | `.splashLogo` | Pre-bundle JF class | works | works | works | works | **HIGH** |
**Summary:** 11 of 14 blocks are HIGH portability (theme-independent
generic JF selectors). The 3 problem children are all variations of
"force pure black background" — and they happen to be the same blocks
flagged in doc 11 as harmful to NeutralFin AND, per §2c above, to be
the cause of the hard-black detail-page band on Cineplex.
> **Operational rule:** when swapping themes, audit blocks 7a / 7b / 7c
> / index.html-anti-flash / index.html-submit-red FIRST. The other
> tweaks ride along automatically.
---
## 5. Logo aspect-ratio fit
ARRFLIX wordmark PNG: **235 × 85 px**, aspect **2.765 : 1**.
| Container | Selector | Sizing on Cineplex/Finity | Wordmark fit | Verdict |
|---|---|---|---|---|
| Admin drawer | `.adminDrawerLogo img` | `<img>` element, `content:` swap, sized by sidebar (~240px wide) | natural — replacement is the displayed image | OK |
| Masthead | `.pageTitleWithLogo` | `<div>`, `bg-image` + `bg-size: contain` (Finity convention) | aspect preserved by `contain`, no squish | OK |
| Detail page logo | `.detailLogo` | `position: absolute; right: 25vw; top: 10vh; width: 25vw; height: 16vh; bg-size: contain` | per-show clear-logo box. ARRFLIX wordmark is not used here — this is the show's clear-logo (e.g. Rick & Morty title art). Not a fit concern for our wordmark. | OK |
| Splash | `.splashLogo` | `width:30%; height:30%; bg-size:contain; centered` | aspect preserved; on a 1920×1080 viewport renders ~576×324 box, wordmark settles at ~576×208 (height-limited by aspect). Looks correct. | OK |
**Verdict:** 235 × 85 fits cleanly in every container. Aspect ratio is
NOT a factor in any of the rendering complaints. The native JF
admin-drawer + masthead use `bg-size: contain`, so a 2.765:1 wordmark
displays without distortion regardless of theme.
---
## 6. Pre-bundle splash quality
Inspecting `web-overrides/index.html` (93 lines, the bind-mounted
override of the JF web shell):
| Aspect | Value | Notes |
|---|---|---|
| `body { background: #000 }` (declared in critical-path `<style>`) | YES | Anti-flash baseline |
| `.splashLogo` size | `width:30%; height:30%` | Centred via `position:fixed; top:50%; left:50%; transform:translate(-50%,-50%)` |
| `.splashLogo bg-image` | inlined data-URL of the 235 × 85 ARRFLIX wordmark | Same PNG as the masthead/admin drawer |
| `.splashLogo bg-size` | `contain` | Aspect preserved |
| Animation | `animation: fadein 0.5s` (defined as `@keyframes fadein { 0%{opacity:0} 100%{opacity:1} }`) | Half-second ease-in |
| Mobile vs desktop variant | `@media (min-device-width: 992px) { .splashLogo { bg-image: <data-URL> } }` | The desktop branch CURRENTLY uses **the same 235 × 85 PNG bytes** as the small/mobile branch — i.e. there is no higher-resolution desktop asset. This is a half-implemented split. Owner could supply a 470 × 170 (2x) or 940 × 340 (4x) PNG to bake into the desktop branch for sharper rendering on 1080p+ displays. |
| Screen reader / `<title>` | `<title>` is set + locked at runtime by `lockTitle()` to `"ARRFLIX"` | OK |
**Verdict:** splash is functional, fade-in is smooth, aspect is correct.
The only quality nit is the desktop `@media` branch reading the same
small PNG as mobile — a 2× or 4× ARRFLIX wordmark in the desktop
branch would be sharper. Defer-able; not a complaint the owner has
raised.
---
## 7. Detail-page backdrop fix proposal (concrete CSS, NOT applied)
Re-stating §2d in implementation-ready form. Expected to drop into
`CustomCss` AFTER the Cineplex `@import`, BEFORE the existing
`bg:#000` blocks (which need to be **scoped out of detail pages** to
not clobber the gradient — see `body.itemDetailPage` selectors below).
```css
/* === Detail-page backdrop fix (proposal, 2026-05-08) === */
/* Convert Finity's 17vw black band into a Netflix-style gradient
overlay over a full-bleed backdrop. */
/* 1. Stretch backdrop container across the full viewport */
.layout-desktop .backdropContainer {
left: 0 !important;
width: 100vw !important;
}
/* 2. Replace Finity's mask.png with a CSS-only linear-gradient
that darkens the left ~50vw and fades to transparent.
`.backgroundContainer.withBackdrop` is the existing overlay
element in the Finity DOM. */
.layout-desktop .backgroundContainer.withBackdrop {
background-image: linear-gradient(
90deg,
rgba(0, 0, 0, 0.95) 0%,
rgba(0, 0, 0, 0.85) 25%,
rgba(0, 0, 0, 0.55) 45%,
rgba(0, 0, 0, 0.20) 65%,
rgba(0, 0, 0, 0.00) 85%
) !important;
background-size: 100vw 100vh !important;
left: 0 !important;
width: 100vw !important;
}
/* 3. UN-clamp the page bg specifically on detail pages so the
gradient composes against the actual backdrop, not pure black.
`.itemDetailPage` is added to <body> by JF on every detail
route (verified in main.jellyfin.bundle.js). */
body.itemDetailPage,
body.itemDetailPage #reactRoot,
body.itemDetailPage .mainAnimatedPages,
body.itemDetailPage .skinBody {
background-color: transparent !important;
}
```
**Before/after expectation:**
- Before: 17vw band on the left of the detail page is **flat `#000`**;
poster card hidden by Finity; title + clear-logo float on a hard
black slab.
- After: backdrop fills 100vw of the viewport. Title + logo float over
a darkened-but-visible slice of the backdrop on the left, fading to
full backdrop brightness around 70-85% across. Reads as
netflix.com's title-card style.
**Stops to tune** once live (open DevTools, edit the gradient stops):
- If title text is illegible against busy artwork, push opacity stops
up: `0.95 / 0.92 / 0.75 / 0.40 / 0.10`.
- If too much of the backdrop is darkened, pull stops left: `0.95 / 0.80 / 0.40 / 0.10 / 0.00`
with the last stop at 60%.
- If the right edge of the gradient creates a visible seam against a
bright backdrop, soften the last stop: append a sixth at
`90% rgba(0,0,0,0)` for an extra 5vw fade.
**Untested side-effects to watch for:**
- Finity hides `.detailImageContainer .card` on desktop. The fix
preserves that (poster card stays hidden — title is the focus).
If owner wants the poster card visible, counter Finity's rule
```css
.layout-desktop .detailImageContainer .card { display: none }
```
by adding `.layout-desktop .detailImageContainer .card { display: revert !important }` to CustomCss.
- The OSD scrubber (`.itemBackdropProgressBar`) sits at the very
bottom of `.itemBackdrop`. With the backdrop now full-width, it's
also full-width (was already, just visually different against a
colour-fade vs. black band).
- Library-list pages that ALSO use the `.backgroundContainer.withBackdrop`
layer (a few in JF — backdrops on library tile rows) will get the
same gradient. If they look wrong, scope rule (1) and (2) to
`body.itemDetailPage .layout-desktop .backdropContainer` etc.
---
## 8. Recommended forward path (top 3 ranked)
### #1 — STAY on Cineplex + apply the §7 detail-page backdrop fix
**Why:** Cineplex is the only Netflix-faithful theme that runs on
JF 10.10.3 with a maintained codebase. The detail-page band is a
*single rule's worth of CSS* away from being a Netflix-style gradient
overlay. We've already invested in the brand stack (ARRFLIX wordmark,
header-icon hide, slider thumbs, Quick Connect off, settings hide); 11
of 14 personal tweaks survive the change, the other 3 (`bg:#000`)
need to be **scoped to non-detail pages** by selector chain
`body:not(.itemDetailPage)` instead of being dropped.
**Risk:** low. CSS-only, additive, no `@import` change, no
`/branding` POST hot-spot. Rolls back trivially.
**Cost:** ~30 minutes to apply, screenshot, tune gradient stops live.
### #2 — Migrate to ElegantFin v25.12.31 with ARRFLIX `#E50914` recolour
**Why:** ElegantFin's detail-page is full-width-backdrop with a
gradient overlay built in — no left band — so the §7 fix becomes
unnecessary. Most actively maintained CSS theme on JF (last commit
2026-04-30, GPL-2.0). The 04 §3e migration documented this exact
config: 8 accent variables overridden, ARRFLIX logo + cast/crew + Quick
Connect + header icons + slider thumbs all preserved.
**Risk:** medium. The previous attempt was overwritten by a sibling
Cineplex POST (race rule in 04 §3b). Personal-tweak block 7c
(`.skinHeader.semiTransparent`) still risks flattening ElegantFin's
translucent header — that block needs editing on landing.
**Cost:** ~45 minutes (re-do migration, scope the bg-clamp rules,
verify all 11 personal tweaks intact post-POST).
**Aesthetic delta vs Cineplex:** ElegantFin is "polished
Jellyseerr-y", Cineplex is "Netflix-faithful". With the recolour
ElegantFin gets the brand red but keeps a non-Netflix layout
(card design, hero strip, etc.). Owner has gone back-and-forth on this
preference — explicitly chose Cineplex this morning.
### #3 — Hybrid: keep Cineplex import + graft NeutralFin's `--gradientPoint` vars
**Why:** for owners who like Cineplex's red+webfont but want
NeutralFin's depth/gradient on backgrounds. Manually copy NeutralFin's
`--darkerGradientPoint #131313 / --lighterGradientPoint #1e1e1e` into a
`:root` block, drop our `--primary-background-color: #000 !important`
overrides, and let the gradient render.
**Risk:** higher than #1 or #2. Variables don't compose across themes —
Cineplex's Finity parent doesn't read those NeutralFin vars; it reads
its own `--theme-background-color`, and CSS won't accept a gradient on a
plain `background-color` anyway. Real grafting needs
`body { background-image: linear-gradient(180deg, #131313, #1e1e1e) }`
plus dropping the `bg:#000 !important` rules.
**Cost:** ~60 min trial-and-error. Likely lower visual reward than #1.
**Verdict:** Recommended order is **#1 first (lowest risk, biggest
backdrop win), then #2 if owner re-evaluates Netflix-fidelity vs
polish, #3 only as a fall-back if #1 doesn't read well**.
---
## 9. Risks + rollback
### Snapshot tag
`snapshot-2026-05-08-pre-elegantfin` — captured before the ElegantFin
attempt. Currently this is **also the rollback point for any further
theme work** because ElegantFin → NeutralFin → Cineplex have all been
applied (and reverted) on top of it. Located at
`snapshots/2026-05-08-pre-elegantfin/`.
If a future change wants its own snapshot, follow the pattern in
`RESTORE.md`: capture `branding.json`, `index.html`, all
`displayprefs-*.json`, `users.json`, `libraries.json`, write a new
`RESTORE.md`, tag the commit.
### Prior failed swaps (timeline 2026-05-08)
| Time | Theme attempted | Outcome |
|---|---|---|
| early today | ElegantFin v25.12.31 (initial pick — pre-Netflix-brief) | replaced by Cineplex when owner asked for Netflix-faithful |
| mid-day | **Cineplex v1.0.6** | applied, working |
| later | ElegantFin v25.12.31 + ARRFLIX recolour (04 §3e) | applied, then silently overwritten by a sibling Cineplex POST (race rule, see 04 §3b) |
| even later | NeutralFin v1.3.0 | applied, but a sibling Cineplex POST overwrote it minutes later (see doc 11 headline finding); also, our `bg:#000 !important` rules clamped its gradient flat so the brief render that DID land looked wrong |
| now | **Cineplex v1.0.6** | active (verified live this audit) |
### Race-rule reminder
`/System/Configuration/branding` takes a complete object on every
POST; whichever POST lands last wins. Per 04 §3b: any agent or script
touching this endpoint MUST `GET → edit-only-its-fields → POST` and
the branding POST must be the **last** in any sequence.
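A minimal sketch of the compliant shape (field names from this doc; the appended CSS comment is a placeholder):
```bash
# Race-rule-compliant branding edit: GET -> edit only your fields -> POST last.
TOKEN=<JELLYFIN_API_TOKEN>
BASE=https://arrflix.s8n.ru

# 1. GET the complete current object; never build it from scratch.
curl -s "$BASE/System/Configuration/branding" \
  -H "Authorization: MediaBrowser Token=$TOKEN" > /tmp/branding.json

# 2. Edit ONLY the field you own (here: append to CustomCss).
python3 - <<'PY'
import json
b = json.load(open('/tmp/branding.json'))
b['CustomCss'] += '\n/* new block goes here */\n'
json.dump(b, open('/tmp/branding-new.json', 'w'))
PY

# 3. POST last in any sequence; whichever POST lands last wins.
curl -s -X POST "$BASE/System/Configuration/branding" \
  -H "Authorization: MediaBrowser Token=$TOKEN" \
  -H "Content-Type: application/json" \
  --data-binary @/tmp/branding-new.json -w "%{http_code}\n" -o /dev/null
# Expect: 204
```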
### Detail-page fix rollback
If §7's CSS lands and looks wrong, remove the three new blocks from
`CustomCss` and POST `branding`. The §7 proposal is purely additive
(no rule removal); revert is a clean delete.
---
## 10. What was NOT touched during this audit
- No POST to `/System/Configuration/branding`.
- No edit to `web-overrides/index.html` or the bind-mounted
`/jellyfin/jellyfin-web/index.html`.
- No `docker compose` action, no container restart.
- No git commit on `snapshots/`, no tag movement.
- All inspections were `curl` GET (`/Branding/Configuration` +
`/System/Configuration/branding`) and `docker exec jellyfin sh -c`
bounded to `cat`/`grep`/`wc`/`ls`.
---
## 11. Sign-off
- **Auditor:** s8n (audit pass, 2026-05-08)
- **Live theme at audit time:** Cineplex v1.0.6 (verified —
`/Branding/Configuration` returns `MRunkehl/cineplex@v1.0.6`)
- **Top likely cause of detail-page black band:** Finity (Cineplex's
parent) ships `--detail-page-backdrop-offset: 17%` by design. Our
`bg:#000 !important` rules turn that intentionally-clear 17vw band
into a hard-black slab. The Finity `mask.png` overlay would have
softened it into a gradient if it loads — worth a DevTools network
check.
- **Recommended forward path:** STAY on Cineplex + apply §7
detail-page CSS (full-bleed backdrop + linear-gradient overlay,
scoped to `body.itemDetailPage`).
- **Personal-tweak portability:** **HIGH** for 11 of 14 blocks; **MEDIUM/LOW**
for the 3 `bg:#000` blocks (must be scoped/dropped on theme swap).
- **Next step:** owner reviews this doc + screenshots the current
detail-page band, decides whether to apply §7. No work on the live
server until that review.

@@ -1,214 +0,0 @@
# 15 - Force English UI for All Users
> Why "Abspielen" showed up on the Play button, every place locale comes from,
> and the per-user mechanism (plus wrapper update) that pins every account
> to English regardless of what `Accept-Language` the browser sends.
Last verified: 2026-05-08 against Jellyfin 10.10.3 web bundle, arrflix.s8n.ru.
---
## Status as of 2026-05-08 — superseded by lockdown sweep
This doc captured the first pass: identifying that `Configuration.UICulture`
was the per-user knob, building `bin/force-english-all-users.sh`, and
patching `bin/add-jellyfin-user.sh`. That was a partial fix — it pinned the
existing five accounts but did not cover server-wide defaults, the web SPA
pre-auth bundle, or a re-apply mechanism that survives Jellyfin restarts /
new users created out-of-band / config drift over time.
A multi-agent lockdown sweep ran 2026-05-08 to close the remaining gaps:
- **Audit baseline:** `docs/19-english-only-audit.md` — every surface
inventoried, current state per layer, "still drifts" notes.
- **Lockdown procedure + persistence:** `docs/20-english-only-lockdown.md`
the canonical operator doc going forward. Covers server / per-user / web
SPA / Accept-Language layers, ships the idempotent re-apply runner at
`bin/english-lockdown-runner.sh`, and documents the systemd timer the
operator can drop in if they want weekly auto re-application.
- **Web-side overrides:** `web-overrides/english-lockdown.{js,css}` — pin
`navigator.language`, hide the language switcher, force-load the en-us
bundle pre-auth. (Sibling agent, separate commit.)
- **Live server settings:** UICulture + PreferredMetadataLanguage +
MetadataCountryCode pushed to the live `arrflix.s8n.ru` server config.
(Sibling agent, separate commit.)
The body below is preserved verbatim as historical context for **why** the
per-user POST mechanism exists. For day-to-day operations, jump to
`docs/20-english-only-lockdown.md`.
---
## TL;DR
- Owner saw German "Abspielen" on the detail-page Play button.
- Root cause: **every Jellyfin user on this server has `Configuration.UICulture` unset**
(key is absent from `GET /Users/{id}` JSON, not just empty string). When that
field is missing, the Jellyfin web SPA falls back to the browser's
`Accept-Language` header. A browser sending `de-*` → German UI.
- There is **no server-side flag** that forces the web client to ignore
`Accept-Language`. Locale is per-user.
- Fix: `POST /Users/{id}/Configuration` with `UICulture` pinned to `"en-US"`
for every existing user, and update `bin/add-jellyfin-user.sh` so future
users get the same pin baked in at creation time.
---
## Where Jellyfin gets UI language from (priority order)
The Jellyfin web client (`/web/index.html` SPA) selects its UI language in
this exact order, first hit wins:
| # | Source | Where it lives | Notes |
|---|--------|----------------|-------|
| 1 | **Per-user `Configuration.UICulture`** | `GET /Users/{id}` JSON, field `Configuration.UICulture` | Authoritative once a user is logged in. Set to `"en-US"` to pin English. |
| 2 | **Browser `Accept-Language`** | HTTP request header, sent by every browser | Fallback when (1) is unset / empty / absent. This is what bit us — Marco's browser sends `de-DE,de;q=0.9,en` and Jellyfin honored it. |
| 3 | **Server `UICulture`** in `/System/Configuration` | Server-wide JSON, current value `"en-US"` | This is the **dashboard / admin** default, NOT applied to user UI. Misleading: setting it does NOT propagate down to clients. |
| 4 | **Pre-auth splash bundle strings** | Static strings in the JS bundle's `en-us.json`/`de.json` | Loaded based on `Accept-Language` BEFORE the user is even authed. Cannot be overridden per-user — see "Limits" below. |
There is **no** `customPrefs.language` key in `DisplayPreferences` — locale is
not stored there. Confirmed by inspecting marco's `DisplayPreferences/usersettings`:
`CustomPrefs` has only `chromecastVersion`, `dashboardTheme`, home sections,
skip lengths, `tvhome`. No language.
There is **no** `EnableNonAdministrativeUserLocaleOverride` or
`EnforcedDisplayLanguage` flag in `/System/Configuration`. Verified via
filtering the full server config for `lang|locale|culture|country` keys —
only `PreferredMetadataLanguage`, `MetadataCountryCode`, and `UICulture`
exist, and `UICulture` server-side is the dashboard-only default.
---
## Per-user state (current)
Audit run 2026-05-08, all 5 users:
| User | UserId | `Configuration.UICulture` |
|------|--------|---------------------------|
| 5 | `571decc67cdc4ea683b4c936b0a31ff8` | **key absent** |
| guest | `82dd8542915740c8ae799b6723542c63` | **key absent** |
| house | `a4cbcdf95bb34888885af6fbf5c340d1` | **key absent** |
| marco | `d787fbfc373a44119a247e7406b2721e` | **key absent** |
| s8n | `2be0f0d3fe3a45dc9298138a15a01925` | **key absent** |
Every account is currently at the mercy of the browser. Whichever browser
hits arrflix.s8n.ru with `Accept-Language: de-*` will see German strings
(Play → Abspielen, Resume → Fortsetzen, etc.). The Play button screenshot
the owner shared is almost certainly Marco logged in from a German-locale
browser, though any account logged in from such a browser would see the same.
---
## Forcing mechanism — per-user POST
The web client reads `UICulture` straight from the user object on auth and
on every refresh. Setting it to `"en-US"` pins the UI to English regardless
of what the browser asks for.
**Endpoint:** `POST /Users/{userId}/Configuration` (returns 204).
**Payload:** the FULL existing `Configuration` block with `UICulture` added
(Jellyfin replaces the whole config dict, it does not patch fields). Fetch
first, modify, POST back — the same read-modify-write pattern step [3/4]
of `add-jellyfin-user.sh` already uses.
**Reference curl** (single user, marco):
```bash
TOKEN=<JELLYFIN_API_TOKEN>
USER_ID=d787fbfc373a44119a247e7406b2721e
curl -s "https://arrflix.s8n.ru/Users/$USER_ID" \
-H "Authorization: MediaBrowser Token=$TOKEN" > /tmp/u.json
python3 -c "
import json
with open('/tmp/u.json') as f: u = json.load(f)
c = u['Configuration']
c['UICulture'] = 'en-US'
print(json.dumps(c))
" > /tmp/u-fixed.json
curl -s -X POST "https://arrflix.s8n.ru/Users/$USER_ID/Configuration" \
-H "Authorization: MediaBrowser Token=$TOKEN" \
-H "Content-Type: application/json" \
--data-binary @/tmp/u-fixed.json -w "%{http_code}\n" -o /dev/null
# Expect: 204
```
The convenience wrapper for all 5 users in one go is at
`bin/force-english-all-users.sh` — read-modify-write loop, idempotent, prints
each user's before/after state.
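A minimal sketch of the loop's shape (the real wrapper also prints before/after state); `jq` on the operator host is assumed:
```bash
# Pin UICulture=en-US for every account: GET the list, read-modify-write each.
TOKEN=<JELLYFIN_API_TOKEN>
BASE=https://arrflix.s8n.ru

for id in $(curl -s "$BASE/Users" \
              -H "Authorization: MediaBrowser Token=$TOKEN" | jq -r '.[].Id'); do
  # Full Configuration block with UICulture merged in
  # (the endpoint replaces the whole dict, it does not patch fields).
  curl -s "$BASE/Users/$id" \
    -H "Authorization: MediaBrowser Token=$TOKEN" \
    | jq '.Configuration + {UICulture: "en-US"}' > /tmp/cfg.json
  curl -s -X POST "$BASE/Users/$id/Configuration" \
    -H "Authorization: MediaBrowser Token=$TOKEN" \
    -H "Content-Type: application/json" \
    --data-binary @/tmp/cfg.json -o /dev/null -w "$id -> %{http_code}\n"
done
```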
---
## Wrapper update for future users
`bin/add-jellyfin-user.sh` step `[3/4]` currently sets
`SubtitleMode`/`SubtitleLanguagePreference`/`AudioLanguagePreference`/
`PlayDefaultAudioTrack` on the new user's `Configuration`. Add `UICulture`
to that same block:
```python
c['SubtitleMode'] = 'Default'
c['SubtitleLanguagePreference'] = 'eng'
c['AudioLanguagePreference'] = 'eng'
c['PlayDefaultAudioTrack'] = True
c['UICulture'] = 'en-US' # NEW: pin UI to English regardless of browser Accept-Language
```
That is a one-line addition; the rest of the wrapper is untouched.
---
## What CANNOT be forced (limits)
1. **Pre-auth splash bundle strings.** Before the user logs in, the web SPA
loads a translation file based on `navigator.language` / browser
`Accept-Language`. The `<title>`, the login form labels, "Sign In",
"Username", "Password" placeholder text, and the loading splash all
resolve from that pre-auth bundle. If the browser is German, those
handful of strings render in German until the user authenticates and
the per-user `UICulture` kicks in.
This is a fundamental architectural limit — there is no server flag that
tells the SPA to ignore `navigator.language`. Workarounds would require
either (a) a runtime shim that overrides `navigator.language` before the
bundle initialises (similar to the existing `inject-shim.py` title
locker), or (b) replacing the German `de.json` translation file in the
web bundle with the English copy. Neither is implemented; both are
in-scope for future work if pre-auth German strings ever become a
complaint.
2. **Reverse-proxy doesn't strip `Accept-Language`.** Traefik passes the
header through unchanged. We could in theory rewrite it to `en-US` at
the proxy, but that breaks any user who genuinely wants a non-English
metadata locale for OTHER apps fronted by the same Traefik (none
currently — but the principle stands). Per-user `UICulture` is cleaner.
3. **Subtitle/audio language preferences** are already pinned to `eng` for
every user via the wrapper, so playback selection is unaffected by
`UICulture`. We are only fixing the **UI chrome** (button labels,
menus, tooltips) here, not media language defaults.
4. **Native mobile clients** (Jellyfin Android/iOS apps) read `UICulture`
the same way the web SPA does, so they will also pick up the pin once
the per-user POST lands. Verified by reading Jellyfin source: same
`User.Configuration.UICulture` field is the authoritative locale on
every official client.
---
## Cleanup steps (owner-triggered)
1. Review this doc and `bin/force-english-all-users.sh`.
2. Run the script with the admin token in env:
```
JELLYFIN_TOKEN=<JELLYFIN_API_TOKEN> bin/force-english-all-users.sh
```
3. Hard-refresh each browser (Ctrl-Shift-R) to clear any cached locale
bundle the SPA loaded on previous visit.
4. Verify by visiting any movie detail page — the button should now read
"Play" in every browser, including ones still sending `de-*`.
5. Apply the wrapper diff to `bin/add-jellyfin-user.sh` so future users
inherit the pin.
No container restart needed. No web bundle rebuild needed. No reverse-proxy
config change needed.

@@ -1,476 +0,0 @@
# 16 - Jellyfin Branding Leaks (Read-Only Audit)
> Owner wants ALL Jellyfin branding hidden user-side. This doc inventories every
> place a logged-in non-admin still sees the word "Jellyfin" or the
> teal/purple triangle logo, and proposes a concrete fix for each.
Last verified: 2026-05-08 against live `https://arrflix.s8n.ru` running
Jellyfin 10.10.3 (`jellyfin/jellyfin` image). Probe account: `marco`
(non-admin, `EnableUserPreferenceAccess=false`).
This doc is **read-only**. No CSS POSTs, no bundle edits, no service
restarts performed. Implementation is a follow-up branch.
---
## TL;DR — counts
| Surface | Reachable as non-admin? | Raw "Jellyfin" mentions |
|---|---|---|
| `index.html` (live, bind-mount) | Yes | 0 (already shimmed: title, app-name, favicon, splashLogo) |
| PWA manifest `fd4301fdc170fd202474.json` | Yes (PWA install + iOS Safari add-to-home + Android install prompt) | **2** (`name`, `short_name`) |
| en-us i18n chunk | Yes (3 entries reachable; 19 are admin/dashboard/wizard) | 22 keys, **3 user-reachable** |
| `main.jellyfin.bundle.js` literals | Edge | 2 (`appName():"Jellyfin Web"` not visible; one error-route phrase) |
| Logo screensaver (`banner-light.png`) | Yes (idle timeout, default 3min) | 1 image asset |
| Apple-touch-startup-image splash PNGs | Yes (iOS Safari "Add to Home" PWA only) | ~20 images |
| Service worker registration message | No | 0 (clean — no JF strings) |
| chromecastPlayer plugin chunk | No (we hide cast btn; chunk only loads if cast invoked) | 0 |
| Browser tab title / favicon | No | 0 (already locked by shim) |
**Recommended fix path:** **CSS hide + JS shim + manifest bind-mount.** No bundle modifications. CSS alone is insufficient (manifest, i18n, screensaver image are CSS-invisible).
---
## Already-fixed (don't redo)
| Surface | Mechanism | Doc |
|---|---|---|
| `<title>Jellyfin</title>` overwrite by SPA | `lockTitle()` regex shim | `10-spa-runtime-shim.md` |
| `<link rel="icon">` Jellyfin teal triangle | Embedded data-URL favicon + `lockFavicon()` | 10 |
| `<meta name="application-name" content="Jellyfin">` | Static replace in bind-mounted index.html (`content="ARRFLIX"`) | 10 |
| `.splashLogo` (login chrome top-left) | Image swap in bind-mounted index.html | 10 |
| `.adminDrawerLogo img` + `.pageTitleWithLogo` | CustomCss `content: url(data:image/png;base64,…)` | `04-theming-and-users.md` §3b |
| Pre-bundle login flash (blue button, dark blue bg) | Inline `<style>` block in bind-mounted index.html | 10 |
| Settings drawer entry (only admin should see) | CustomCss `:has()` rules + JS `nukeSettings()` MutationObserver | 10 |
| Quick Connect button | CustomCss `.btnQuick { display:none }` + server-side disabled | 04 |
| Cast / SyncPlay / User header icons | CustomCss `.headerCastButton` etc. | 04 |
Confirmed live (2026-05-08, marco session):
```
GET /web/index.html → <title>ARRFLIX</title>
<meta name="application-name" content="ARRFLIX">
<link rel="apple-touch-icon" sizes="180x180" href="data:image/png;base64,…"> (ARRFLIX logo)
ARRFLIX-SHIM-BEGIN block present and runs.
GET /Branding/Configuration → CustomCss includes Cineplex + ARRFLIX overrides as expected.
```
---
## Findings — by severity
### S1 visible-everywhere (PWA + idle screensaver)
#### F1 — PWA manifest `name` and `short_name` are "Jellyfin"
- **Location:** `https://arrflix.s8n.ru/web/fd4301fdc170fd202474.json`
- **Live payload:**
```json
{ "name": "Jellyfin", "description": "The Free Software Media System",
"short_name": "Jellyfin", "start_url": "index.html#/home.html",
"theme_color": "#101010", "background_color": "#101010",
"icons": [ { "src": "touchicon72.png" }, …, { "src": "touchicon512.png" } ] }
```
- **User-visible where:**
- Android Chrome: install prompt label, home screen shortcut name, app drawer name.
- iOS Safari "Add to Home Screen": shortcut label.
- Desktop Chrome/Edge: "Install ARRFLIX" / install card title.
- Browser PWA badge (`navigator.getInstalledRelatedApps()`-style surfaces).
- **Fix mechanism:** **Bind-mount manifest** (the static index.html bind-mount is already proven to work). Replace `name`/`short_name` with `ARRFLIX`. Optionally clear `description` or set to a neutral string. Touchicon images already replaced via the data-URL `apple-touch-icon` patch in index.html, BUT the manifest still references `touchicon{72,114,144,512}.png` which are Jellyfin-branded PNGs on disk. We can either (a) bind-mount replacement PNGs, or (b) point the manifest icons array at our data URL via inline data-URI refs (Chrome accepts `"src": "data:image/png;base64,…"`).
- **Risk:** Low. Manifest is static JSON; nothing else parses it. Browser fetches manifest on install; if file is bind-mounted RO, container reads on each request just like index.html (same compose pattern, same inode-pin gotcha — see `10-spa-runtime-shim.md` §"Single-file bind mount inode gotcha").
- **Replacement file (proposed `web-overrides/fd4301fdc170fd202474.json`):**
```json
{
"name": "ARRFLIX",
"description": "ARRFLIX",
"lang": "en-US",
"short_name": "ARRFLIX",
"start_url": "index.html#/home.html",
"theme_color": "#000000",
"background_color": "#000000",
"display": "standalone",
"icons": [
{ "sizes": "72x72", "src": "touchicon72.png", "type": "image/png" },
{ "sizes": "114x114", "src": "touchicon114.png", "type": "image/png" },
{ "sizes": "144x144", "src": "touchicon144.png", "type": "image/png" },
{ "sizes": "512x512", "src": "touchicon512.png", "type": "image/png" }
]
}
```
(touchicon\*.png images are a separate Phase-2 swap — see F4.)
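Once the bind-mount lands, a quick check against the live endpoint confirms the swap (assumes `jq`):
```bash
# Verify the served manifest post-bind-mount.
curl -s https://arrflix.s8n.ru/web/fd4301fdc170fd202474.json | jq -r '.name, .short_name'
# Expect both lines to read: ARRFLIX
```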
#### F2 — Logo screensaver shows Jellyfin banner on idle
- **Location:** `/web/logoScreensaver-plugin.8edf3eac91e564799c27.chunk.js`
injects `<img src="assets/img/banner-light.png">` into a `.logoScreenSaver` div
on idle timeout.
- **Live trigger:** Default screensaver kicks in after the user idles on any
page. Plays bouncing/spinning Jellyfin banner animation.
- **Fix mechanism options:**
1. **Server-side disable** (best): in user policy or server config, disable
the logo screensaver / set screensaver to "None". Confirmed reachable via
`Configuration` API. Do this for the system default; non-admins can't
override since their preferences are locked.
2. **CSS hide** (always works): append to CustomCss
```css
.logoScreenSaver, .logoScreenSaverImage { display: none !important; }
```
The screensaver div still mounts but renders nothing. Visually this
means a black overlay on idle (acceptable).
3. **CSS image swap** (ARRFLIX-branded screensaver):
```css
.logoScreenSaverImage { content: url("data:image/png;base64,<ARRFLIX>") !important; }
```
Reuses the same data URL we already inject in CustomCss for
`.adminDrawerLogo img`.
- **Risk:** Low. Screensaver is a presentation-only plugin; hiding it does
not break navigation, hotkeys, or playback. Option 3 is purely cosmetic.
- **Recommendation:** Option 1 (disable) + Option 2 (CSS belt) for defense
in depth.
---
### S2 detail-only / per-action (i18n strings)
#### F3 — i18n strings rendered to non-admin in error / playback paths
22 i18n keys in `en-us-json.667484b4a441712c7e05.chunk.js` contain "Jellyfin".
Of those, **3 are reachable as a non-admin user**:
| Key | String | When shown |
|---|---|---|
| `PlaybackErrorPlaceHolder` | "This is a placeholder for physical media that **Jellyfin** cannot play. Please insert the disc to play." | Player attempts to play a placeholder/disc-only item. Rare for an arr-fed library but possible. |
| `UnsupportedPlayback` | "**Jellyfin** cannot decrypt content protected by DRM but all content will be tried regardless, including protected titles. Some files may appear completely black due to encryption or other unsupported features, such as interactive titles." | DRM playback fallback dialog. Rare. |
| `MessageChromecastConnectionError` | "Your Google Cast receiver is unable to contact the **Jellyfin** server. Please check the connection and try again." | Cast initiation fails. We hide cast button so this is now functionally unreachable, but the keystrokes for cast can still be invoked from desktop browsers via media keys. |
The remaining 19 keys (`AllowStreamSharingHelp`, `EncodingFormatHelp`,
`ErrorAddingMediaPathToVirtualFolder`, `ErrorDeletingItem`, `ErrorDeletingLyrics`,
`KnownProxiesHelp`, `LabelAutomaticDiscoveryHelp`, `LabelDisplayLanguageHelp`,
`LabelPublishedServerUriHelp`, `MessageConfirmRestart`, `MessageDirectoryPickerBSDInstruction`,
`PleaseRestartServerName`, `ServerRestartNeededAfterPluginInstall`, `UserProfilesIntro`,
`WelcomeToProject`, `WizardCompleted`, `WriteAccessRequired`, `XmlTvPathHelp`,
`ConfirmEndPlayerSession`) are admin-only — Dashboard, setup wizard, plugin
manager, virtual folder management, restart confirms, encoding settings.
Non-admins cannot reach those routes (server policy + drawer hides + we
already strip the Settings link).
- **Fix mechanism:** **JS shim with MutationObserver** that walks DOM text
nodes and rewrites `Jellyfin → ARRFLIX`. Snippet appended to
`bin/inject-shim.py`:
```js
function rewriteJellyfinText(){
try {
var WORD = /\bJellyfin\b/g;
var walker = document.createTreeWalker(document.body, NodeFilter.SHOW_TEXT, null);
var n;
while ((n = walker.nextNode())) {
if (n.nodeValue && WORD.test(n.nodeValue)) {
n.nodeValue = n.nodeValue.replace(WORD, 'ARRFLIX');
}
}
} catch(e){}
}
// Wire into start():
// - call once at start()
// - call from body MutationObserver
// - call from setInterval safety net (1s)
```
- **Risk:**
- Performance: full-document text walk on every DOM mutation is O(N).
Mitigate by debouncing (run only if mutation contains added/removed
text nodes; use `requestIdleCallback`).
- False positives: rewriting text inside `<input>` value or `<textarea>`
— none of these strings live there, so safe.
- i18n drift on JF upgrade: if upstream renames the keys, this is still
safe (string-level rewrite, not key-level).
- Aria-labels and `title` attributes are NOT covered by `SHOW_TEXT`.
Add a separate pass that walks `[aria-label*="Jellyfin"]` and
`[title*="Jellyfin"]` if any surface needs it (none observed in audit).
- **Why not bind-mount the en-us-json chunk:** filename is content-hashed
(`en-us-json.667484b4a441712c7e05.chunk.js`). Every JF release bumps the
hash and the bind-mount becomes a 404. Fragile. JS shim wins.
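**Finding the current chunk filename after an image bump** (if you ever need
to confirm what the shim would otherwise be competing with): `runtime.bundle.js`
carries webpack's chunk-id-to-name and chunk-id-to-hash maps. A sketch,
assuming those maps keep their current shape:
```bash
curl -ks 'https://arrflix.s8n.ru/web/runtime.bundle.js' | python3 -c "
import re, sys
txt = sys.stdin.read()
hashmap = dict(re.findall(r'(\d+):\"([a-f0-9]{20})\"', txt))
namemap = dict(re.findall(r'(\d+):\"([a-zA-Z0-9_-]+-json)\"', txt))
for cid, name in namemap.items():
    if name.startswith('en-us'):
        print(f'{name}.{hashmap[cid]}.chunk.js')
"
```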
---
### S3 edge / iOS-only
#### F4 — Apple PWA splash images and touchicon\*.png
- **Location:** `/web/{6a2e2e6b4186720e5d4f.png, eb8bef…, 3fa90c…, …}`
20 different `apple-touch-startup-image` PNGs declared in `index.html`,
plus `/web/touchicon{72,114,144,512}.png` referenced from manifest.
- **User-visible where:** iOS Safari "Add to Home Screen" install + launch
splash. Android Chrome icon-only fallback if data-URL fails (rare).
- **Fix mechanism:**
- **Phase 1 (cheap, ~70% covered):** Bind-mount the manifest (F1) so
`touchicon*.png` references can be redirected to data URLs in the
icons array. iOS Safari ignores those, but Android picks them up.
- **Phase 2 (full coverage):** Generate ARRFLIX-branded PNGs at the
20 device resolutions the apple-touch-startup-image media queries
expect, and bind-mount them under their content-hash filenames (`6a2e2e6b…png` etc.). Brittle — JF rebuilds rotate hashes.
  - **Pragmatic alternative:** strip the apple-touch-startup-image entries
    from the bind-mounted index.html entirely. iOS will fall back to a
    blank splash with the (already-ARRFLIX) apple-touch-icon. Loses the
    "polished install splash" but kills the leak (sketch after this F4 block).
- **Risk:** Low. iOS PWA install rate on a private invite-only service
is a tiny fraction of sessions. Defer until owner reports actual
user friction.
- **Recommendation:** Defer. The PWA install path is rare enough on a
desktop/laptop-dominant private service that this is a Phase 3 polish.
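A minimal sketch of the pragmatic alternative, assuming the bind-mounted copy
lives at the usual `web-overrides/index.html` path; the truncate-then-write
rewrite keeps the inode the bind-mount points at (see doc 10):
```bash
python3 - <<'EOF'
import re
p = '/opt/docker/jellyfin/web-overrides/index.html'
html = open(p).read()
# Drop every apple-touch-startup-image <link>; leave apple-touch-icon alone.
html = re.sub(r'<link[^>]*rel="apple-touch-startup-image"[^>]*>\s*', '', html)
open(p, 'w').write(html)  # truncate-then-write: same inode, bind stays live
EOF
```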
#### F5 — `main.jellyfin.bundle.js` literal "Jellyfin Web" appName + error-route phrase
- **Location 1:** `AppHost.appName():"Jellyfin Web"` — sent in
`X-Emby-Authorization: MediaBrowser Client="Jellyfin Web"` header on
every API call. NOT user-visible chrome. Visible only in the user's
Devices list (which they can't reach since `EnableUserPreferenceAccess=false`)
and in the admin Dashboard "Active Devices" view. Non-admin: zero
exposure.
- **Location 2:** `"working in a future Jellyfin update."` — embedded in
the deprecated/removed-route React component (`/web/#/some-old-path`).
Reachable only via stale bookmark to a removed route. Edge.
- **Fix mechanism:** None. Bundle modifications are explicitly out of
scope (`CONSTRAINTS: no bundle modifications`). Both leaks are
non-admin-invisible in normal flow.
- **Risk of fixing:** rewriting `main.jellyfin.bundle.js` would break
source-map verification, JF auto-updates, and would have to be redone
every image bump. Not worth it.
---
## Recommended fix order
| # | Fix | Effort | User-visible win |
|---|---|---|---|
| 1 | **Manifest bind-mount** (F1) | 5 min | Eliminates "Jellyfin" from PWA install + home-screen + app drawer. |
| 2 | **Disable logo screensaver** server-side + CSS belt (F2) | 5 min | Eliminates Jellyfin banner during idle (currently the most-visible animated leak). |
| 3 | **DOM text-rewrite shim** for `Jellyfin → ARRFLIX` (F3) | 15 min | Catches all 22 i18n keys + any future JF upgrade leaks; covers playback errors and unreachable admin paths defensively. |
| 4 | **Apple splash + touchicon swap** (F4) | 1-2h (image gen) | iOS PWA install polish. Defer. |
| 5 | **Bundle literals** (F5) | N/A | Skip — non-admin-invisible. |
Phases 1-3 give 100% coverage for non-admin chrome. Phase 4 polishes the iOS install path. Phase 5 is out of scope.
---
## Implementation plan — concrete snippets
### Snippet A — manifest bind-mount
Add `web-overrides/fd4301fdc170fd202474.json` (full file body in F1 above).
Compose volume:
```yaml
volumes:
- /opt/docker/jellyfin/web-overrides/index.html:/jellyfin/jellyfin-web/index.html:ro
- /opt/docker/jellyfin/web-overrides/fd4301fdc170fd202474.json:/jellyfin/jellyfin-web/fd4301fdc170fd202474.json:ro
```
Deploy (no container restart needed):
```bash
scp /tmp/ARRFLIX/web-overrides/fd4301fdc170fd202474.json \
user@192.168.0.100:/opt/docker/jellyfin/web-overrides/fd4301fdc170fd202474.json
curl -ks https://arrflix.s8n.ru/web/fd4301fdc170fd202474.json | jq -r .name # expect "ARRFLIX"
```
**Inode-pin gotcha:** scp's `truncate-then-write` is safe; rsync via temp-file
+ rename will orphan the bind. Same rule as index.html (see doc 10).
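Illustration of the safe vs unsafe update patterns:
```bash
# Safe: overwrite the existing file in place (inode preserved)
scp fd4301fdc170fd202474.json user@192.168.0.100:/opt/docker/jellyfin/web-overrides/
rsync --inplace fd4301fdc170fd202474.json user@192.168.0.100:/opt/docker/jellyfin/web-overrides/

# Unsafe: default rsync writes a temp file and renames it over the target,
# swapping the inode and orphaning the container's bind-mount
rsync fd4301fdc170fd202474.json user@192.168.0.100:/opt/docker/jellyfin/web-overrides/
```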
**Hash-rotation gotcha:** if a future JF image bumps the manifest filename
hash, this bind path 404s. Verify after every image upgrade:
```bash
curl -ks https://arrflix.s8n.ru/web/index.html | grep -oE 'rel="manifest" href="[^"]*"'
# expect href="fd4301fdc170fd202474.json" — if changed, rename bind file.
```
### Snippet B — screensaver disable + CSS belt
Server-side (one-time as admin):
```bash
TOKEN=<admin token>
# /System/Configuration is a full-replace endpoint (Jellyfin swaps the whole
# config dict — see doc 20), so read-modify-write rather than POSTing a
# partial body:
curl -ks https://arrflix.s8n.ru/System/Configuration \
  -H "X-Emby-Token: $TOKEN" \
  | python3 -c 'import json,sys; c=json.load(sys.stdin); c["DefaultScreensaverPlugin"]="none"; print(json.dumps(c))' \
  | curl -ks -X POST https://arrflix.s8n.ru/System/Configuration \
      -H "X-Emby-Token: $TOKEN" -H "Content-Type: application/json" \
      --data-binary @-
```
CSS belt (append to CustomCss via existing `04-theming-and-users.md` workflow):
```css
/* Hide Jellyfin logo screensaver — 2026-05-08 (doc 16) */
.logoScreenSaver,
.logoScreenSaverImage { display: none !important; }
```
### Snippet C — DOM text-rewrite shim (covers F3)
Append to the IIFE in `bin/inject-shim.py`, between `nukeSettings` and
`start`:
```js
var JF_WORD = /\bJellyfin\b/g;
function rewriteJellyfinText(root){
try {
var r = root || document.body;
if (!r) return;
var w = document.createTreeWalker(r, NodeFilter.SHOW_TEXT, {
acceptNode: function(n){
var p = n.parentNode;
if (!p) return NodeFilter.FILTER_REJECT;
var tag = p.nodeName;
// Skip <script>, <style>, <textarea>, <input> contents
if (tag === 'SCRIPT' || tag === 'STYLE' || tag === 'TEXTAREA' || tag === 'INPUT') {
return NodeFilter.FILTER_REJECT;
}
        // Reset first: /g regexes carry lastIndex between calls, and a
        // stale offset from a prior node could skip a match in this one.
        JF_WORD.lastIndex = 0;
        return JF_WORD.test(n.nodeValue) ? NodeFilter.FILTER_ACCEPT : NodeFilter.FILTER_REJECT;
}
});
var n;
while ((n = w.nextNode())) {
n.nodeValue = n.nodeValue.replace(JF_WORD, 'ARRFLIX');
}
// aria-label / title attributes
var attrEls = r.querySelectorAll('[aria-label*="Jellyfin"], [title*="Jellyfin"]');
for (var i = 0; i < attrEls.length; i++) {
var el = attrEls[i];
if (el.getAttribute('aria-label')) {
el.setAttribute('aria-label', el.getAttribute('aria-label').replace(JF_WORD, 'ARRFLIX'));
}
if (el.getAttribute('title')) {
el.setAttribute('title', el.getAttribute('title').replace(JF_WORD, 'ARRFLIX'));
}
}
} catch(e){}
}
```
Wire into `start()`:
```js
function start(){
lockTitle(); lockFavicon(); nukeSettings(); rewriteJellyfinText();
// … existing head observer …
if (document.body && window.MutationObserver) {
new MutationObserver(function(muts){
nukeSettings();
// Only re-walk if a mutation added text — avoid full-doc walk on every keystroke
var dirty = false;
for (var i = 0; i < muts.length && !dirty; i++) {
var m = muts[i];
if (m.addedNodes && m.addedNodes.length) dirty = true;
else if (m.type === 'characterData') dirty = true;
}
if (dirty) rewriteJellyfinText();
}).observe(document.body, { childList:true, subtree:true, characterData:true });
}
setInterval(function(){
/* … existing … */
rewriteJellyfinText();
}, 1000);
}
```
**Performance:** `acceptNode` filter rejects non-matching nodes O(1) per
node, so the walker is cheap. Adding/removing list items in a 5000-item
library scroll triggers ~5000 reject calls per render frame, which is
sub-ms in Chromium. No `requestIdleCallback` needed for this scale.
**Why not just text-replace the whole document body markup string in place:**
that approach destroys all React event listeners and breaks navigation.
The TreeWalker approach mutates only `nodeValue` on already-rendered text
nodes, so React's reconciler is undisturbed.
### Snippet D — defer-but-noted: touchicon\*.png
Phase 4. Generate ARRFLIX-branded PNGs at 72/114/144/512 px and bind-mount
each:
```yaml
- /opt/docker/jellyfin/web-overrides/touchicon72.png:/jellyfin/jellyfin-web/touchicon72.png:ro
- /opt/docker/jellyfin/web-overrides/touchicon114.png:/jellyfin/jellyfin-web/touchicon114.png:ro
- /opt/docker/jellyfin/web-overrides/touchicon144.png:/jellyfin/jellyfin-web/touchicon144.png:ro
- /opt/docker/jellyfin/web-overrides/touchicon512.png:/jellyfin/jellyfin-web/touchicon512.png:ro
```
These four filenames are *not* content-hashed, so the bind survives JF
upgrades.
The 20 apple-touch-startup-image PNGs *are* content-hashed; skip those
or strip their `<link>` tags from the bind-mounted index.html.
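When Phase 4 does happen, generating the four non-hashed icons is mechanical.
A sketch, assuming ImageMagick is installed and the repo's `assets/logo.png`
is the source mark (letterboxing on black is a design guess):
```bash
for s in 72 114 144 512; do
  convert assets/logo.png -resize "${s}x${s}" \
    -background black -gravity center -extent "${s}x${s}" \
    "/opt/docker/jellyfin/web-overrides/touchicon${s}.png"
done
```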
---
## i18n shim vs bundle bind-mount — why we choose shim
| Approach | Survives JF upgrade? | Effort/upgrade | Fragility |
|---|---|---|---|
| Bind-mount `en-us-json.<hash>.chunk.js` | No (filename rotates each release) | Re-extract + re-mount each upgrade | High |
| DOM text-rewrite shim (chosen) | Yes | Zero | Low — string-level rewrite, key-agnostic |
| Override-language-pack server config | Partially (only changes display lang, doesn't strip "Jellyfin" from custom strings) | One-time | Doesn't fix the leak |
| Custom branding in `LoginDisclaimer` (already used) | N/A — only affects login screen disclaimer | One-time | Already in place; doesn't touch other strings |
The shim is the only non-fragile, upgrade-immune solution short of forking
the bundle.
---
## PWA manifest gotcha — flagged
The owner asked specifically: "If the manifest contains `name:Jellyfin`,
propose an override approach (bind-mount a custom manifest.json)."
**Confirmed: Yes, manifest contains `"name":"Jellyfin"` and `"short_name":"Jellyfin"`.**
Override approach: bind-mount the file as in Snippet A. The compose
config is already set up for the same pattern (index.html). One additional
volume line. The only new risk is the hash-rotation case — record the
filename in `web-overrides/README.md` and grep-verify after every JF image
bump.
---
## Out-of-scope notes
- **`description: "The Free Software Media System"`** in the manifest is
a Jellyfin-project tagline, not the literal "Jellyfin" word. Owner asked
for "Jellyfin" specifically; the description is replaced in our
proposed manifest anyway (set to "ARRFLIX").
- **`assets/img/banner-dark.png`** is not user-reachable as non-admin
(would only render in admin theme previews). Skip.
- **`fresh.svg` / `rotten.svg`** (Rotten Tomatoes) are not Jellyfin-branded.
Already handled by Cineplex CSS. Skip.
- **`avatar.png`** is the default user avatar (generic person icon) — not
Jellyfin-branded. Skip.
---
## Verification post-fix
After deploying Phase 1-3, re-run this audit and confirm:
```bash
# F1 — manifest
curl -ks https://arrflix.s8n.ru/web/fd4301fdc170fd202474.json | jq -r '.name, .short_name'
# expect: ARRFLIX / ARRFLIX
# F2 — screensaver
TOKEN=<admin>
curl -ks https://arrflix.s8n.ru/System/Configuration -H "X-Emby-Token: $TOKEN" | jq -r '.DefaultScreensaverPlugin'
# expect: "none" (or empty)
# F3 — i18n shim
# Manual: Open DevTools console, run:
# document.title.includes('Jellyfin') || document.body.innerText.includes('Jellyfin')
# expect: false
# Belt: any-Jellyfin-anywhere check
curl -ks https://arrflix.s8n.ru/web/index.html | grep -ohE '\bJellyfin\b' | wc -l
# expect: occurrences only in shim regex source (not in user-visible chrome)
```
---
## Sign-off
- **Audit run by:** s8n, 2026-05-08, non-admin session as `marco`.
- **Mode:** read-only. No CSS POSTs, no bundle edits, no service restarts.
- **Live state:** index.html shim active and correct; manifest leak confirmed; screensaver leak confirmed; i18n leaks confirmed (3 reachable / 22 total in en-us chunk).
- **Recommended next action:** implement Phase 1 (manifest bind-mount) +
Phase 2 (screensaver disable + CSS belt) in a single follow-up branch;
Phase 3 (DOM text shim) in a separate branch since it touches the
critical inject-shim.py path and warrants its own verification.

View file

@ -1,474 +0,0 @@
# 17 - Dev Mirror + Settings Drawer Leak Diagnosis & Fix (Dev Only)
> Owner asked for two things in one session:
>
> 1. Make `https://dev.arrflix.s8n.ru` a complete behavioural mirror of prod
> `https://arrflix.s8n.ru` so the dev box is a faithful test bench.
> 2. With dev mirroring prod, definitively diagnose and fix the long-standing
> "Settings entry still appears in the drawer for non-admin users" issue —
> **on dev only**. Owner reviews dev visually before any prod swap.
>
> Date: 2026-05-08. Live verification in `/tmp/arrflix-headless/` (screenshots,
> drawer DOM dumps, selector tests). Prod was **not** modified. The shared
> `web-overrides/index.html` bind-mounted into the prod container was **not**
> edited. Dev now bind-mounts a separate `index-dev.html` of its own.
---
## TL;DR
| Surface | Mirrored to dev? | Method |
|---|---|---|
| Branding (`LoginDisclaimer`, `CustomCss`, `SplashscreenEnabled`) | YES — byte-equal | `GET /System/Configuration/branding` on prod, `POST` on dev |
| `web-overrides/index.html` shim+splash+favicon | YES (initially the shared file; now dev-only `index-dev.html`) | docker-compose bind-mount |
| Libraries (`Movies`, `TV Shows`) | YES — same paths, same `LibraryOptions` | `POST /Library/VirtualFolders` per lib |
| Non-admin users (5, aloy, guest, house, marco, pet) | YES — recreated as `<u>-mirror` with placeholder `dev-test-<u>` passwords | `bin/add-jellyfin-user.sh` |
| `DisplayPreferences` (`client=emby`) per user | YES — copied verbatim from prod | `GET → POST /DisplayPreferences/usersettings` |
| Library scan (item counts within tolerance) | YES — dev 173 ep / prod 168 ep (Mando importing) | `POST /Library/Refresh` |
**Settings drawer leak — root cause:** The drawer Settings entry is rendered as
```html
<a is="emby-linkbutton"
class="navMenuOption lnkMediaFolder btnSettings emby-button"
data-itemid="settings"
href="#">
<span class="material-icons navMenuOptionIcon settings"></span>
<span class="navMenuOptionText">Settings</span>
</a>
```
The `href` is literally `#`. The actual route is wired by a JS click handler
keyed off `data-itemid="settings"`. Every existing CSS rule we had —
`a[href*="mypreferencesmenu"]`, `[to*="mypreferencesmenu"]`,
`[href$="mypreferencesmenu.html"]`, `[to="/mypreferencesmenu.html"]` — matched
**zero** elements in the live DOM (verified via headless probe).
**Fix (dev only, in `index-dev.html`):**
- CSS: `a.btnSettings, .navMenuOption.btnSettings, [data-itemid="settings"] { display: none !important; }`
- JS shim `nukeSettings()` extended to also match `a.btnSettings` and `[data-itemid="settings"]`, with the legacy `mypreferencesmenu` selectors kept as fallback.
---
## Phase 1 — Mirror procedure
### 1.1 Complete dev's first-run wizard
Dev was a fresh container (`StartupWizardCompleted=false`). Three calls:
```bash
DEV=https://dev.arrflix.s8n.ru
curl -ks -X POST "$DEV/Startup/Configuration" \
-H 'Content-Type: application/json' \
-d '{"UICulture":"en-US","MetadataCountryCode":"US","PreferredMetadataLanguage":"en"}'
# Gotcha: POSTing a NEW name to /Startup/User raises
# System.InvalidOperationException: Sequence contains no elements
# because the wizard already auto-created a placeholder admin "MyJellyfinUser"
# on first request. So set the password on the existing name first:
curl -ks -X POST "$DEV/Startup/User" \
-H 'Content-Type: application/json' \
-d '{"Name":"MyJellyfinUser","Password":"2001dude"}'
curl -ks -X POST "$DEV/Startup/RemoteAccess" \
-H 'Content-Type: application/json' \
-d '{"EnableRemoteAccess":true,"EnableAutomaticPortMapping":false}'
curl -ks -X POST "$DEV/Startup/Complete"
```
Then authenticate, save the token, and rename the admin:
```bash
DEV_TOKEN=$(curl -ks -X POST "$DEV/Users/AuthenticateByName" \
-H 'Content-Type: application/json' \
-H 'Authorization: MediaBrowser Client="setup", Device="setup", DeviceId="setup", Version="1.0"' \
-d '{"Username":"MyJellyfinUser","Pw":"2001dude"}' \
| python3 -c 'import json,sys; print(json.load(sys.stdin)["AccessToken"])')
# Rename: GET full user object, mutate Name, POST back to /Users/{id}
DEV_USER_ID=...
curl -ks "$DEV/Users/$DEV_USER_ID" -H "Authorization: MediaBrowser Token=\"$DEV_TOKEN\"" \
| python3 -c 'import json,sys; u=json.load(sys.stdin); u["Name"]="s8n-dev"; print(json.dumps(u))' \
| curl -ks -X POST "$DEV/Users/$DEV_USER_ID" \
-H "Authorization: MediaBrowser Token=\"$DEV_TOKEN\"" \
-H 'Content-Type: application/json' --data-binary @-
```
### 1.2 Mirror branding
```bash
PROD=https://arrflix.s8n.ru
PROD_TOKEN=...
curl -ks "$PROD/System/Configuration/branding" \
-H "Authorization: MediaBrowser Token=\"$PROD_TOKEN\"" > /tmp/prod-branding.json
curl -ks -X POST "$DEV/System/Configuration/branding" \
-H "Authorization: MediaBrowser Token=\"$DEV_TOKEN\"" \
-H 'Content-Type: application/json' \
--data-binary @/tmp/prod-branding.json
```
Verified `LoginDisclaimer`, `CustomCss` (25985 chars), `SplashscreenEnabled=true`
all byte-equal between dev and prod after POST.
### 1.3 Mirror web-overrides bind-mount
Initial mirror used the **shared** prod file:
```yaml
# /opt/docker/jellyfin-dev/docker-compose.yml — initial mirror state
- /opt/docker/jellyfin/web-overrides/index.html:/jellyfin/jellyfin-web/index.html:ro
```
`docker compose up -d --force-recreate jellyfin-dev`. Confirmed dev served
`<title>ARRFLIX</title>`, `<meta name="application-name" content="ARRFLIX">`,
embedded data-URL apple-touch-icon (ARRFLIX), and the `/* ARRFLIX-SHIM-BEGIN */`
script block.
**Then for Phase 2 fix-isolation**, the mount was switched to a dev-only file
copy so dev fixes don't bleed into prod:
```yaml
# /opt/docker/jellyfin-dev/docker-compose.yml — final dev state
- /opt/docker/jellyfin-dev/web-overrides/index-dev.html:/jellyfin/jellyfin-web/index.html:ro
```
`/opt/docker/jellyfin-dev/web-overrides/index-dev.html` was created by `cp`
from the prod shared file, then patched with the V2 fix described in Phase 2.
### 1.4 Mirror libraries
```bash
curl -ks "$PROD/Library/VirtualFolders" -H "Authorization: MediaBrowser Token=\"$PROD_TOKEN\"" \
> /tmp/prod-libs.json
# For each lib: POST /Library/VirtualFolders?name=...&collectionType=...&paths=...&refreshLibrary=false
# with body {"LibraryOptions": <prod LibraryOptions>}
# (script in conversation log; reproducible via python3 driver.)
```
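A sketch of such a driver (not the exact conversation-log script), assuming
the standard `VirtualFolderInfo` fields (`Name`, `CollectionType`, `Locations`,
`LibraryOptions`) in `/tmp/prod-libs.json`:
```bash
python3 - <<'EOF'
import json, subprocess, urllib.parse

DEV = "https://dev.arrflix.s8n.ru"
TOKEN = "<DEV_TOKEN>"  # from /tmp/dev-creds.env

for lib in json.load(open("/tmp/prod-libs.json")):
    qs = urllib.parse.urlencode({
        "name": lib["Name"],
        "collectionType": lib.get("CollectionType") or "",
        "refreshLibrary": "false",
    })
    # paths repeat as individual query params
    qs += "".join("&paths=" + urllib.parse.quote(p, safe="")
                  for p in lib.get("Locations", []))
    subprocess.run(
        ["curl", "-ks", "-X", "POST", f"{DEV}/Library/VirtualFolders?{qs}",
         "-H", f'Authorization: MediaBrowser Token="{TOKEN}"',
         "-H", "Content-Type: application/json",
         "--data-binary", json.dumps({"LibraryOptions": lib["LibraryOptions"]})],
        check=True)
EOF
```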
Result: dev has `Movies → /media/movies` and `TV Shows → /media/tv` with the
same `LibraryOptions` (`PreferredMetadataLanguage=en`, `MetadataCountryCode=US`,
`EnableInternetProviders=false`, `SubtitleDownloadLanguages=[eng]`,
`TheMovieDb` as sole metadata fetcher, etc.).
### 1.5 Mirror users
For each non-admin prod user (5, aloy, guest, house, marco, pet) the
existing `bin/add-jellyfin-user.sh` wrapper was reused with placeholder
passwords:
```bash
export JELLYFIN_URL=https://dev.arrflix.s8n.ru
export JELLYFIN_TOKEN=$DEV_TOKEN
for u in 5 guest house marco pet aloy; do
bash bin/add-jellyfin-user.sh "$u-mirror" "dev-test-$u"
done
```
The `-mirror` suffix avoids any confusion with prod accounts. Owner can rotate
or rename later.
### 1.6 Mirror DisplayPreferences
`bin/add-jellyfin-user.sh` already applies the canonical home layout, BUT to
get full parity for any owner-customised layouts (marco's home in particular)
the prod prefs were copied verbatim:
```bash
for u in 5 aloy guest house marco pet; do
curl -ks "$PROD/DisplayPreferences/usersettings?userId=<prod-id>&client=emby" \
-H "Authorization: MediaBrowser Token=\"$PROD_TOKEN\"" \
| curl -ks -X POST \
"$DEV/DisplayPreferences/usersettings?userId=<dev-id>&client=emby" \
-H "Authorization: MediaBrowser Token=\"$DEV_TOKEN\"" \
-H 'Content-Type: application/json' --data-binary @-
done
```
All 6 returned HTTP 204.
### 1.7 Library scan + parity check
```bash
curl -ks -X POST "$DEV/Library/Refresh" -H "Authorization: MediaBrowser Token=\"$DEV_TOKEN\""
```
Within 5 seconds:
| | MovieCount | SeriesCount | EpisodeCount |
|---|---|---|---|
| Prod | 2 | 6 | 168 |
| Dev | 2 | 6 | 173 |
Dev caught up to prod. The +5 episode delta likely reflects scan timing
(The Mandalorian is still importing, so prod's last-counted total lags dev's
fresh scan); well within the ±20 tolerance.
---
## Phase 2 — Diagnosis (headless Chrome)
### 2.1 Setup
`chromium`/`chromedriver` not installed via dnf — instead used the existing
playwright cache at `~/.cache/ms-playwright/chromium-1217`:
```bash
python3 -m venv /tmp/arrflix-venv
/tmp/arrflix-venv/bin/pip install -q playwright
# probe.py + verify_fix2.py + verify_native.py — see /tmp/arrflix-headless/
```
Login page selectors discovered:
- username: `#txtManualName` (NOT `input[name="username"]`)
- password: `#txtManualPassword`
Drawer button: `.mainDrawerButton`.
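For reference, the probe's core is small. A sketch (login selectors from
above; the submit-button selector is an assumption):
```bash
/tmp/arrflix-venv/bin/python - <<'EOF'
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto("https://dev.arrflix.s8n.ru/")
    page.fill("#txtManualName", "marco-mirror")
    page.fill("#txtManualPassword", "dev-test-marco")
    page.click("button[type=submit]")        # assumed login-submit selector
    page.wait_for_selector(".mainDrawerButton")
    page.click(".mainDrawerButton")
    page.wait_for_selector(".mainDrawer.drawer-open")
    # Report the Settings entry the way section 2.4 measures it
    print(page.eval_on_selector_all(
        '[data-itemid="settings"]',
        "els => els.map(e => getComputedStyle(e).display)"))
    browser.close()
EOF
```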
### 2.2 Drawer DOM (the smoking gun)
`/tmp/arrflix-headless/drawer-dom.html` (full):
```html
<div class="mainDrawer transition touch-menu-la drawer-open" style="...">
<div class="mainDrawer-scrollContainer scrollContainer focuscontainer-y scrollY">
<div style="height:.5em;"></div>
<a is="emby-linkbutton" class="navMenuOption lnkMediaFolder emby-button" href="#/home.html">
<span class="material-icons navMenuOptionIcon home"></span>
<span class="navMenuOptionText">Home</span>
</a>
<div class="customMenuOptions"></div>
<div class="libraryMenuOptions">
<h3 class="sidebarHeader">Media</h3>
<a is="emby-linkbutton" data-itemid="f137a2dd21bbc1b99aa5c0f6bf02a805"
class="lnkMediaFolder navMenuOption emby-button"
href="#/movies.html?topParentId=...">
<span class="material-icons navMenuOptionIcon movie"></span>
<span class="sectionName navMenuOptionText">Movies</span>
</a>
<a is="emby-linkbutton" data-itemid="767bffe4f11c93ef34b805451a696a4e"
class="lnkMediaFolder navMenuOption emby-button"
href="#/tv.html?topParentId=...">
<span class="material-icons navMenuOptionIcon tv"></span>
<span class="sectionName navMenuOptionText">TV Shows</span>
</a>
</div>
<div class="userMenuOptions">
<h3 class="sidebarHeader">User</h3>
<a is="emby-linkbutton" class="navMenuOption lnkMediaFolder btnSettings emby-button"
data-itemid="settings" href="#"> <!-- ← the leak -->
<span class="material-icons navMenuOptionIcon settings"></span>
<span class="navMenuOptionText">Settings</span>
</a>
<a is="emby-linkbutton" class="navMenuOption lnkMediaFolder btnLogout emby-button"
data-itemid="logout" href="#">
<span class="material-icons navMenuOptionIcon exit_to_app"></span>
<span class="navMenuOptionText">Sign Out</span>
</a>
</div>
</div>
</div>
```
Key observations:
- **Settings `<a>` `href="#"`** — pure dummy hash, no `mypreferencesmenu` substring anywhere.
- **Stable identifiers:** `class="... btnSettings ..."` and `data-itemid="settings"`.
- **Section header `<h3>User</h3>`** is rendered as a plain element. After hiding
Settings, only Sign Out remains under it; the "User" header itself stays
(not orphaned, since Sign Out keeps the section meaningful). Owner can
decide whether to also drop the header in a later iteration.
### 2.3 Why every prior CSS rule failed
Headless evaluation of each candidate selector against the live drawer:
| Selector | Match count |
|---|---|
| `a[href*="mypreferencesmenu"]` | **0** |
| `li:has(> a[href*="mypreferencesmenu"])` | **0** |
| `.MuiListItem-root:has(a[href*="mypreferencesmenu"])` | **0** |
| `[to="/mypreferencesmenu.html"]` | **0** |
| `a[href*="mypreferences"]` | **0** |
| `a[href$="mypreferencesmenu.html"]` | **0** |
| `a[href="#/mypreferencesmenu.html"]` | **0** |
| `.navMenuOption[href*="mypreferencesmenu"]` | **0** |
| `div:has(> a[href*="mypreferencesmenu"])` | **0** |
All 9 prior selectors target **zero** DOM nodes. The shim's
`nukeSettings()` MutationObserver was firing 1×/sec but matching nothing.
This explains why CSS-only and JS-only attempts both kept failing.
### 2.4 The selector that works
```css
a.btnSettings,
.navMenuOption.btnSettings,
[data-itemid="settings"] {
display: none !important;
}
```
Headless before/after:
| | display | height |
|---|---|---|
| Before injection | `flex` | 47.0px |
| After CSS injected | `none` | 0px |
| Sign Out (control) | `flex` | 47.0px (unchanged) |
Screenshots:
- `/tmp/arrflix-headless/v02-drawer-before-fix.png` — drawer shows Home / Media / User → Settings + Sign Out
- `/tmp/arrflix-headless/v03-drawer-after-fix.png` — drawer shows Home / Media / User → Sign Out only
### 2.5 Why `#href` and a JS-routed click
Jellyfin's web bundle uses an `embyRouter` (the legacy Emby app shell) that
dispatches navigation via JS click handlers. For drawer items wired to
internal routes, the bundle either:
1. Sets `href="#/path.html"` (works for plain hash routing — all our Movies/TV
links use this form).
2. Sets `href="#"` and registers a `click` handler keyed by some attribute.
Settings + Sign Out + the user-icon in the header all use form 2.
The canonical attribute keys used in form 2 are:
- `data-itemid="settings"` → opens `Preferences/Display` (or
`Dashboard/General` for admins).
- `data-itemid="logout"` → calls the sign-out handler.
This pattern dates back to the Emby fork and is unlikely to change in 10.x.
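A quick DevTools check for which form any drawer item uses:
```js
// Lists each drawer link's routing key and href ("#" means form 2)
document.querySelectorAll('.mainDrawer a').forEach(function (a) {
  console.log(a.dataset.itemid || '(none)', a.getAttribute('href'), a.className);
});
```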
---
## Phase 3 — Verification protocol
### 3.1 Native verification (V2 fix in `index-dev.html`, no client injection)
`/tmp/arrflix-headless/verify_native.py` — sign in, open drawer, measure.
```
Native dev (V2 fix in place): {
"settings": { "display": "none", "visibility": "visible", "height": 0, "inline": "none" },
"signOut": { "display": "flex", "visibility": "visible", "height": 47.015625, "inline": "" },
"settingsCount": 1
}
PASS: Settings hidden by index-dev.html out-of-the-box
Final drawer post-nav: [{'display': 'none', 'height': 0}]
```
`settingsCount: 1` confirms the `<a>` is **still in the DOM** (we don't
delete the node — that risks Jellyfin's drawer-renderer rebuilding it on
the next render). The element is present but `display:none` from both the
CSS rule and the JS shim's inline-style override. Sign Out is preserved.
After clicking Home from the drawer and reopening the drawer, the Settings
entry is still hidden (`display: 'none', height: 0`) — confirms the
MutationObserver re-applies on every drawer rebuild.
Screenshots:
- `/tmp/arrflix-headless/native-01-home.png` — post-login home view
- `/tmp/arrflix-headless/native-02-drawer.png` — drawer after V2 fix
(Settings absent)
- `/tmp/arrflix-headless/native-03-home-via-drawer.png` — home reached
via drawer click (still works)
- `/tmp/arrflix-headless/native-04-drawer-post-nav.png` — drawer
reopened after navigation (Settings still hidden)
### 3.2 Manual verification checklist (for owner)
- [ ] Sign in to https://dev.arrflix.s8n.ru as `marco-mirror` / `dev-test-marco`.
- [ ] Click the hamburger top-left.
- [ ] Drawer should show: Home / Media (Movies, TV Shows) / User (Sign Out only).
- [ ] No "Settings" gear icon under the User section.
- [ ] Click Movies, TV Shows, Home — all navigate normally.
- [ ] Reopen drawer after each navigation — Settings should remain absent.
- [ ] Optional regression check: sign in as `s8n-dev` (admin) to confirm
admin still sees Settings — currently this fix hides it for **everyone**
(admins included). If owner wants admin to retain access, see open
question Q1 below.
---
## Recommended swap-to-prod procedure
When owner approves: **merge the `index-dev.html` JS shim block + CSS rule
into `web-overrides/index.html`, then `docker compose restart jellyfin`.**
Concrete diff (to apply to `/opt/docker/jellyfin/web-overrides/index.html`):
1. Inside the inline `<style>` block (above `</style>` near line 16), add:
```css
/* ARRFLIX V2 (2026-05-08) — hide drawer Settings for non-admins.
Drawer Settings link is .btnSettings / [data-itemid="settings"] href="#".
Old href*="mypreferencesmenu" rules never matched. */
a.btnSettings,
.navMenuOption.btnSettings,
[data-itemid="settings"] {
display: none !important;
}
```
2. Inside the `nukeSettings()` function in `ARRFLIX-SHIM-BEGIN`, replace the
selector list:
```js
var nodes = document.querySelectorAll(
'a.btnSettings, [data-itemid="settings"], a[href*="mypreferencesmenu"], [to*="mypreferencesmenu"]'
);
```
The exact patched `index-dev.html` is at
`/opt/docker/jellyfin-dev/web-overrides/index-dev.html` on nullstone — diff
it against `/opt/docker/jellyfin/web-overrides/index.html` to see the two
isolated changes. The `inject-shim.py` script in `bin/` should also be
updated to match (so re-running it doesn't revert the fix).
**No prod changes performed in this session.** Awaiting owner sign-off.
---
## Open questions for owner
**Q1 — Admins too?** Current rule hides Settings for **everyone**, including
admin users. If admin should still reach Settings, options:
(a) keep current rule, admins navigate to `/web/index.html#/dashboard.html`
manually via URL bar (works fine; Settings under-the-hood routes there);
(b) refine rule with a body-class check (`body.lacking-pref-access` —
requires bundle hint that doesn't exist today);
(c) accept the rule and document the workaround.
Recommendation: **(a) — let admins type the URL.** They can also edit the
drawer DOM via dev tools if needed; no real friction. Non-admins are the
threat surface.
**Q2 — User header?** The `<h3>User</h3>` section header remains visible
above the lone "Sign Out" entry. Visually fine but slightly orphan-feeling.
Worth hiding too? If yes:
```css
.userMenuOptions .sidebarHeader { display: none !important; }
```
But this also fires for admins.
**Q3 — Mirror vs prod password parity?** Dev mirror users have placeholder
passwords (`dev-test-<u>`). For better visual fidelity owner may want to
match prod passwords. Not strictly needed for testing the drawer fix.
**Q4 — Dev admin name.** Created as `MyJellyfinUser` then renamed to
`s8n-dev`. Password is the same `2001dude` as prod admin — owner may want
to rotate.
---
## Files referenced
- Live patched dev index: `/opt/docker/jellyfin-dev/web-overrides/index-dev.html` (on nullstone)
- Live dev compose: `/opt/docker/jellyfin-dev/docker-compose.yml` (on nullstone, backups in same folder)
- Headless artifacts: `/tmp/arrflix-headless/` (on onyx)
- `drawer-dom.html` — full drawer DOM dump
- `selector-tests.json` — match counts for every prior selector
- `settings-finds.json` — every Settings-text and href-matching node
- `verify_native.py` — final verification script
- `native-{01..04}-*.png` — final fix screenshots
- `v02-drawer-before-fix.png` / `v03-drawer-after-fix.png` — before/after CSS injection
- Prod-state captures:
- `/tmp/prod-branding.json`
- `/tmp/prod-libs.json`
- `/tmp/prod-counts.json`
- Dev creds env: `/tmp/dev-creds.env` (on onyx — `DEV_TOKEN`, `DEV_USER_ID`)

View file

@ -1,347 +0,0 @@
# 19 - English-Only Lockdown Audit (Read-Only Baseline)
> Owner saw the Play button render as **"Abspielen"** (German). Goal:
> "everything English only, remove the ability to be in another language at
> all". This doc supplements `docs/15-force-english.md` and `docs/16-jellyfin-branding-leaks.md`
> it is the cross-layer baseline for the lockdown branch.
Audited: 2026-05-08 against live `https://arrflix.s8n.ru`, Jellyfin 10.10.3.
Auditor: s8n. Mode: read-only. No POST/PATCH/PUT to Jellyfin, no file
modifications outside this doc.
---
## TL;DR — root cause + why doc 15 didn't close it
1. **Per-user `Configuration.UICulture` is still absent on every account.**
All 8 users return `Configuration.UICulture` as a missing key (verified
live 2026-05-08, see Per-User Table below). Doc 15 correctly identified
the fix and shipped the `bin/force-english-all-users.sh` script — but
**the script was never executed**. There is no audit trail of a
`204 No Content` against `/Users/{id}/Configuration` in the activity log
for any user, and the live state proves it (UICulture still absent on
all 8). When `UICulture` is absent, the SPA falls back to
`navigator.language` / `Accept-Language`, so any browser sending `de-*`
loads the German bundle and renders "Abspielen". This is layer (5) in
the table below.
2. **The German translation bundle is shipped and live.** `de-json.1afccc006ab8bb6c5953.chunk.js`
is reachable, returns HTTP 200, and contains `"Play":"Abspielen"`,
`"Settings":"Einstellungen"`, `"Save":"Speichern"`, etc. — 1963 unique
translated keys. 92 other locale chunks ship alongside it. Until those
are removed from the served bundle, the SPA can always select a
non-English locale even if every user has `UICulture=en-US` (e.g. a new
user who never authenticated, or a tampered SPA). Doc 15 explicitly
noted "no server flag forces SPA to ignore Accept-Language" but stopped
at the per-user pin — it didn't propose deleting the bundles.
In two sentences: **The Play button renders "Abspielen" because every user
has `Configuration.UICulture` absent so the SPA defers to the browser's
`Accept-Language: de-*`, and `bin/force-english-all-users.sh` (the doc-15
fix) was authored but never run. Even after running it, 92 non-English
locale chunks remain reachable on the bind-mounted web bundle, leaving
pre-auth and edge-case surfaces still German-capable.**
---
## Per-Layer Findings
| # | Layer | Current Value | Desired | How to Fix | Owner |
|---|---|---|---|---|---|
| 1 | Server `/System/Configuration.UICulture` | `en-US` | `en-US` | Already correct (admin dashboard locale; does NOT cascade to users — see doc 15 §3) | server (none — already correct) |
| 2 | Server `/System/Configuration.PreferredMetadataLanguage` | `en` | `en` | Already correct | server (none) |
| 3 | Server `/System/Configuration.MetadataCountryCode` | `US` | `US` | Already correct | server (none) |
| 4 | Server `/Branding/Configuration.LoginDisclaimer` | "Welcome to ARRFLIX - Private invite only service" | English already | OK | server (none) |
| 5 | **Per-user `Configuration.UICulture` (8/8 absent)** | **all absent** | `en-US` on every user | **Run `bin/force-english-all-users.sh`** with admin token; idempotent. Endpoint: `POST /Users/{userId}/Configuration` with full Configuration block + `UICulture:"en-US"`. | **server agent — primary fix** |
| 6 | Per-user `Configuration.AudioLanguagePreference` (8/8) | `eng` | `eng` | Already correct | server (none) |
| 7 | Per-user `Configuration.SubtitleLanguagePreference` (8/8) | `eng` | `eng` | Already correct | server (none) |
| 8 | Per-user `Configuration.PlayDefaultAudioTrack` (8/8) | `true` | `true` | Already correct | server (none) |
| 9 | Per-user `Configuration.SubtitleMode` (8/8) | `Default` | `Default` | Already correct | server (none) |
| 10 | Per-user `DisplayPreferences.CustomPrefs.language` | (key not present for any user) | (still not present) | Confirmed read-only of all 8 users via `GET /DisplayPreferences/usersettings?userId=...&client=emby` — no `language` key in `CustomPrefs`. Locale is NOT stored here. Layer is non-issue. | none |
| 11 | Plugin-shipped UI strings | 6 plugins (AudioDB, MusicBrainz, OMDb, Open Subtitles, Studio Images, TMDb); none ship locale UI strings | None | No action — these are metadata-source plugins, not UI string sources. | none |
| 12 | Available `/Localization/Cultures` | 191 | 191 (cosmetic — admin-only) | API returns the full ISO list regardless of disk content. Cannot be trimmed via API. Admin-only. Defer. | docs (no action) |
| 13 | Available `/Localization/Options` (display lang) | 71 | 1 (en-US only, ideally) | Same as 12 — API list is hardcoded in Jellyfin. Cannot be trimmed via API. **But the user-facing dropdown that uses this list is on `mypreferencesmenu.html` which is already hidden by the inject-shim.** Non-issue for non-admins; admin keeps full list. | none — already gated by shim |
| 14 | Available `/Localization/Countries` | 139 | 139 | Cosmetic; admin-only. No action. | none |
| 15 | SPA `index.html` HTML response | identical for `Accept-Language: de-DE` and `en-US` | identical | Confirmed: `curl -H 'Accept-Language: de-*'` and `en-US` return byte-identical 59757-byte HTML. **Locale selection happens client-side in JS**, not server-side. So there is no server header rewrite to add. | web (none) |
| 16 | **Web bundle locale chunks `<lang>-json.<hash>.chunk.js`** | **93 locale chunks served (de, fr, es, ru, zh-cn, ja, ko, ...)** including `de-json.1afccc006ab8bb6c5953.chunk.js` containing `"Play":"Abspielen"` | only `en-us-json.<hash>.chunk.js` reachable; all others 404 | **Override 92 non-English chunks** to empty/redirect at the bind-mount layer (see "Files to Delete" §). Compose pattern: bind-mount each as `:/jellyfin/jellyfin-web/<lang>-json.<hash>.chunk.js:ro` from a stub chunk (exact webpack-push contents under Path B in the remediation checklist). Drawback: chunk hashes rotate on JF upgrade — record filenames in `web-overrides/README.md` and re-pin after each image bump. **Cleaner alternative:** a higher-priority Traefik router whose path rule matches any `*-json.*.chunk.js` without the `en-us` prefix, pointed at a dummy backend so those requests 404 (Traefik has no stock middleware that rejects by path regex). | **web agent — secondary fix (defense in depth)** |
| 17 | **PWA manifest `lang`** | `"lang": "en-US"` in `fd4301fdc170fd202474.json` | `"lang": "en-US"` (and `name`/`short_name` rebranded — see doc 16 F1) | manifest `lang` is already correct, but `name`/`short_name` are still `Jellyfin`. Folded into doc 16 F1, not duplicated here. | web (doc 16 work) |
| 18 | Pre-auth splash bundle strings | reads `navigator.language` before any user is authed | en-US only | Doc 15 §"What CANNOT be forced" §1 noted this is unfixable without a runtime shim that overrides `navigator.language`. **NEW PROPOSAL:** patch `bin/inject-shim.py` to inject `Object.defineProperty(navigator, 'language', { value: 'en-US' }); Object.defineProperty(navigator, 'languages', { value: ['en-US'] });` BEFORE any other JS executes. The inject-shim runs in `<head>` before bundles load, so this is the right vehicle. | **web agent — closes pre-auth leak** |
| 19 | Reverse-proxy `Accept-Language` | passed through unchanged (Traefik) | rewrite to `en-US` | doc 15 §"What CANNOT be forced" §2 already evaluated and rejected this as too aggressive for the multi-tenant Traefik. **Re-evaluation:** ARRFLIX is the only consumer of arrflix.s8n.ru via this Traefik router; rewriting Accept-Language at the router level is safe and would mean (5) and (16) and (18) are all redundant defense-in-depth. Add a `traefik.http.middlewares.arrflix-lang.headers.customrequestheaders.Accept-Language=en-US,en;q=0.9` middleware. | **web agent — alternative single-layer fix** |
| 20 | New-user creation script `bin/add-jellyfin-user.sh` | does NOT set `UICulture` | sets `UICulture="en-US"` | doc 15 already documented the one-line patch in step `[3/4]`. Apply the diff. | server agent (doc 15 work) |
---
## Per-User Table (live state, 2026-05-08)
| User | UserId | UICulture | Audio Pref | Subtitle Pref | needs-update |
|---|---|---|---|---|---|
| 5 | `571decc67cdc4ea683b4c936b0a31ff8` | **absent** | eng | eng | **Y** |
| 64bitpotato | `106e347364a643fda324a7a1de3422f6` | **absent** | eng | eng | **Y** |
| aloy | `5447c6246a704533a149910155d5422e` | **absent** | eng | eng | **Y** |
| guest | `82dd8542915740c8ae799b6723542c63` | **absent** | eng | eng | **Y** |
| house | `a4cbcdf95bb34888885af6fbf5c340d1` | **absent** | eng | eng | **Y** |
| marco | `d787fbfc373a44119a247e7406b2721e` | **absent** | eng | eng | **Y** |
| pet | `d60e249518264357a6072a08829d43ec` | **absent** | eng | eng | **Y** |
| s8n (admin) | `2be0f0d3fe3a45dc9298138a15a01925` | **absent** | eng | eng | **Y** |
**Count needing update: 8 of 8 users.** This is the entire active user
roster. Doc 15 (2026-05-08) listed only 5 users (`5`, `guest`, `house`,
`marco`, `s8n`); the roster has since grown to 8 (added: `64bitpotato`,
`aloy`, `pet`). All 3 new users were created via `bin/add-jellyfin-user.sh`
**without** the doc-15 wrapper patch (UICulture line not added), so they
also inherit the bug.
---
## Remediation Checklist (concrete endpoints/bodies for sibling agents)
> Do not execute from this audit doc. Sibling agents own implementation.
### Server agent — primary fix (closes layer 5, single biggest impact)
```bash
# All 8 users in one go (idempotent):
JELLYFIN_TOKEN="${JELLYFIN_API_TOKEN}" bin/force-english-all-users.sh
# Spot-verify one user post-fix (expect "en-US"):
curl -ks https://arrflix.s8n.ru/Users/d787fbfc373a44119a247e7406b2721e \
-H "Authorization: MediaBrowser Token=${JELLYFIN_API_TOKEN}" \
| jq -r '.Configuration.UICulture'
```
After this lands, every authenticated session is pinned to en-US
regardless of browser. Pre-auth and chunk-bundle leaks (16, 18) remain.
### Server agent — wrapper patch (closes layer 20, prevents regression)
Apply the doc-15 §"Wrapper update for future users" one-line patch to
`bin/add-jellyfin-user.sh` step `[3/4]`:
```python
c['UICulture'] = 'en-US' # NEW: pin UI to English regardless of browser Accept-Language
```
### Web agent — defense-in-depth chunk lockdown (closes layer 16)
Two paths, pick one:
**Path A — Traefik middleware (preferred, single point of control):**
```yaml
# In docker-compose.yml jellyfin labels:
- "traefik.http.routers.jellyfin.middlewares=arrflix-lang"
- "traefik.http.middlewares.arrflix-lang.headers.customrequestheaders.Accept-Language=en-US,en;q=0.9"
```
Pros: one line, no bind-mounts to maintain, immune to JF upgrades.
Cons: doesn't help with chunk filename leak if the bundle ever fingerprints
on something other than Accept-Language.
**Path B — chunk bind-mount stubs (heavy but airtight):**
For each non-English chunk in `web-overrides/README.md` (record list per
JF image upgrade), bind a stub chunk (contents below):
```yaml
- /opt/docker/jellyfin/web-overrides/empty-chunk.js:/jellyfin/jellyfin-web/de-json.1afccc006ab8bb6c5953.chunk.js:ro
- /opt/docker/jellyfin/web-overrides/empty-chunk.js:/jellyfin/jellyfin-web/fr-json.<hash>.chunk.js:ro
... (×91 more)
```
Where `empty-chunk.js` contents:
```js
(self.webpackChunk=self.webpackChunk||[]).push([[XXXXX],{}]);
```
(XXXXX = the locale's webpack chunk-id; recover it from the same
runtime.bundle.js maps the extraction script in §"Files to Delete"
scrapes; the `namemap` keys are the chunk-ids.)
Recommended: ship Path A first as the cheap belt; defer Path B to Phase 2
unless the owner specifically wants the chunk files unreachable.
### Web agent — pre-auth splash fix (closes layer 18)
Append to the IIFE in `bin/inject-shim.py`, before the `start()` block:
```js
// Override navigator.language BEFORE webpack bundles read it
try {
Object.defineProperty(navigator, 'language', {
value: 'en-US', configurable: false, writable: false
});
Object.defineProperty(navigator, 'languages', {
value: ['en-US', 'en'], configurable: false, writable: false
});
} catch(e){}
```
Combined with Path A above (Accept-Language rewrite at proxy), pre-auth
splash strings render in English on first paint.
### Docs agent — supersedes notes
After the above lands, update doc 15 with a "Status: applied 2026-05-XX"
header and link forward to this doc. Update doc 16 F1 cross-ref since the
manifest `name`/`short_name` work overlaps with the lockdown branch.
---
## Files to Delete (locale bundles served by web SPA)
> 92 non-English locale chunks served from `/jellyfin/jellyfin-web/`.
> Hashes were captured from the live `runtime.bundle.js` chunk-id-to-hash
> map on 2026-05-08; **these will rotate on every JF image upgrade**, so
> regenerate this list before each upgrade with:
>
> ```bash
> curl -ks 'https://arrflix.s8n.ru/web/runtime.bundle.js?<query>' | python3 -c "
> import re, sys
> txt = sys.stdin.read()
> hashmap = dict(re.findall(r'(\d+):\"([a-f0-9]{20})\"', txt))
> namemap = dict(re.findall(r'(\d+):\"([a-zA-Z0-9_-]+-json)\"', txt))
> for cid, name in sorted(namemap.items(), key=lambda x: x[1]):
> if not name.startswith('en-us'):
> print(f'{name}.{hashmap[cid]}.chunk.js')
> "
> ```
The single chunk to **keep** is `en-us-json.667484b4a441712c7e05.chunk.js`.
The 92 chunks to **drop** (current hashes — re-extract on upgrade):
```
af-json.c51579ebcde4cc473828.chunk.js
ar-json.1e4d5a6f9a6acf5777ba.chunk.js
as-json.c9ec5dcf74b613f34865.chunk.js
be-by-json.04e26c1f665c26cef640.chunk.js
bg-bg-json.8f63ff103b1093a4367b.chunk.js
bn-json.<hash>.chunk.js
bn_BD-json.<hash>.chunk.js
ca-json.<hash>.chunk.js
ch-json.<hash>.chunk.js
cs-json.<hash>.chunk.js
cy-json.<hash>.chunk.js
da-json.<hash>.chunk.js
de-json.1afccc006ab8bb6c5953.chunk.js ← THE ONE THAT BIT US (contains "Play":"Abspielen")
el-json.<hash>.chunk.js
en-gb-json.<hash>.chunk.js ← keep? en-GB is also English; defer to owner. If owner wants only en-US, drop.
eo-json.<hash>.chunk.js
es-ar-json.<hash>.chunk.js
es-json.<hash>.chunk.js
es-mx-json.<hash>.chunk.js
es_419-json.<hash>.chunk.js
es_DO-json.<hash>.chunk.js
et-json.<hash>.chunk.js
eu-json.<hash>.chunk.js
fa-json.<hash>.chunk.js
fi-json.<hash>.chunk.js
fil-json.<hash>.chunk.js
fo-json.<hash>.chunk.js
fr-ca-json.<hash>.chunk.js
fr-json.<hash>.chunk.js
ga-json.<hash>.chunk.js
gl-json.<hash>.chunk.js
gsw-json.<hash>.chunk.js
gu-json.<hash>.chunk.js
he-json.<hash>.chunk.js
hi-in-json.<hash>.chunk.js
hr-json.<hash>.chunk.js
hu-json.<hash>.chunk.js
hy-json.<hash>.chunk.js
id-json.<hash>.chunk.js
is-is-json.<hash>.chunk.js
it-json.<hash>.chunk.js
ja-json.<hash>.chunk.js
jbo-json.<hash>.chunk.js
ka-json.<hash>.chunk.js
kab-json.<hash>.chunk.js
kk-json.<hash>.chunk.js
kn-json.<hash>.chunk.js
ko-json.<hash>.chunk.js
lt-lt-json.<hash>.chunk.js
lv-json.<hash>.chunk.js
mk-json.<hash>.chunk.js
ml-json.<hash>.chunk.js
mn-mn-json.<hash>.chunk.js
mr-json.<hash>.chunk.js
ms-json.<hash>.chunk.js
nb-json.<hash>.chunk.js
ne-json.<hash>.chunk.js
nl-json.<hash>.chunk.js
nn-json.<hash>.chunk.js
pa-json.<hash>.chunk.js
pl-json.<hash>.chunk.js
pr-json.<hash>.chunk.js
pt-br-json.<hash>.chunk.js
pt-json.<hash>.chunk.js
pt-pt-json.<hash>.chunk.js
ro-json.<hash>.chunk.js
ru-json.<hash>.chunk.js
si-json.<hash>.chunk.js
sk-json.<hash>.chunk.js
sl-si-json.<hash>.chunk.js
so-json.<hash>.chunk.js
sq-json.<hash>.chunk.js
sr-json.<hash>.chunk.js
sv-json.<hash>.chunk.js
sw-json.<hash>.chunk.js
ta-json.<hash>.chunk.js
te-json.<hash>.chunk.js
th-json.<hash>.chunk.js
tr-json.<hash>.chunk.js
uk-json.<hash>.chunk.js
ur_PK-json.<hash>.chunk.js
uz-json.<hash>.chunk.js
vi-json.5ce142c3b4228beafe7a.chunk.js
zh-cn-json.9ef4c0ef42cc04d64912.chunk.js
zh-hk-json.faa0648f6b0f186e6c07.chunk.js
zh-tw-json.d07cd62eb7dd68687b64.chunk.js
zu-json.0c869775f5145121570c.chunk.js
... (full 92-line list saved to web-overrides/README.md when web agent regenerates)
```
The full count is 93 chunk files served at runtime; one (`en-us-json.<hash>`)
is kept, 92 are dropped. Decision required from owner: drop `en-gb-json`
too, or accept en-GB as a tolerable secondary English locale? Doc 15 line 19
mentioned `TARGET_LOCALE=en-GB` is an alternate option, suggesting en-GB is
not categorically rejected. **Default recommendation: drop en-gb too —
"English only, en-US canonical".**
---
## Cross-References
- `docs/15-force-english.md` — original per-user UICulture diagnosis +
`bin/force-english-all-users.sh` (script exists, **not yet run**) +
wrapper patch for `bin/add-jellyfin-user.sh` (**not yet applied**).
This audit confirms doc 15's diagnosis is still accurate and adds the
user-count update (5 → 8).
- `docs/16-jellyfin-branding-leaks.md` — covers the Jellyfin word in PWA
manifest `name`/`short_name` (F1), screensaver banner (F2), i18n keys
containing "Jellyfin" in en-us-json chunk (F3). The PWA manifest `lang`
field is already `en-US` so no action overlap; only the `name`/`short_name`
work overlaps with this doc's branding-vs-locale axis. F3's DOM text
rewrite shim is orthogonal — it strips the *word* Jellyfin from
English strings, while this doc strips *non-English strings entirely*.
- `docs/10-spa-runtime-shim.md` — vehicle for the proposed
`Object.defineProperty(navigator, 'language', …)` snippet (see Layer 18).
Same `inject-shim.py` already in use; one new `try/catch` block.
- `docs/04-theming-and-users.md` — CustomCss is unrelated to locale; no
overlap, no action.
---
## Sign-off
- **Audit run by:** s8n, 2026-05-08, admin token via `X-Emby-Token` header.
- **Mode:** read-only. Zero POST/PATCH/PUT to Jellyfin. Zero file
modifications outside this `docs/19-english-only-audit.md`.
- **Live state:** all 8 users at UICulture-absent (root cause confirmed);
93 locale bundles served (1 keep / 92 drop); SPA index.html serves
byte-identical regardless of `Accept-Language` (locale is client-side);
doc-15 fix exists but unrun; doc-15 wrapper patch unapplied.
- **Recommended next action:** server agent runs `bin/force-english-all-users.sh`
and applies the wrapper patch (closes 80% of the leak in 30 seconds).
Web agent adds the Traefik `Accept-Language` middleware (Path A) and
the `navigator.language` shim (closes the remaining pre-auth leak).
Defer chunk bind-mounts (Path B) to Phase 2.

View file

@ -1,275 +0,0 @@
# 20 - English-Only Lockdown
> Operator doc for the multi-layer English-only lockdown on arrflix.s8n.ru.
> Goal: everything English only, no opt-out, no drift. Server, per-user,
> and web-SPA layers all pinned; idempotent re-apply runner ships in this
> repo so a Jellyfin restart, container recreate, or new-user-out-of-band
> can never quietly reintroduce another locale.
Date: 2026-05-08
Jellyfin version: 10.10.3 (`jellyfin/jellyfin` image)
Live target: `https://arrflix.s8n.ru`
---
## Goal
**Everything English only, no opt-out, no drift.**
Three things this means in practice:
1. No user — admin or non-admin — can flip the UI to a non-English locale,
either through the settings drawer or by deleting their `UICulture` value
and letting `Accept-Language` win.
2. No new user created (via `bin/add-jellyfin-user.sh`, the web admin panel,
or a future API integration) starts in any state other than `en-US`.
3. No server-side default (UI, metadata language, metadata country) drifts
away from English over time, regardless of Jellyfin upgrades, container
recreates, or admin-panel touches.
The earlier first-pass attempt (`docs/15-force-english.md`,
`bin/force-english-all-users.sh`) only covered point (2) for the five
existing users at the time it ran. Points (1) and (3) and the persistence
mechanism are handled here.
Audit baseline for "what each layer looked like before this lockdown" is in
`docs/19-english-only-audit.md`.
---
## Layers covered
The Jellyfin locale story is layered, and **each layer must be pinned
independently** — fixing one does not protect the others. The lockdown
covers all four:
### 1. Server-wide
Three keys in `/System/Configuration` (the JSON returned by
`GET /System/Configuration`):
| Key | Pinned value | What it controls |
|---|---|---|
| `UICulture` | `en-US` | Dashboard / admin UI default. Does NOT propagate to user UI (that's per-user — see layer 2) but is still pinned for consistency and so admin chrome never drifts. |
| `PreferredMetadataLanguage` | `en` | Default language for metadata fetched from TMDB / TVDB / etc. when a library has no per-library override. |
| `MetadataCountryCode` | `US` | Default country code for region-specific metadata (release dates, ratings boards, etc.). |
The runner POSTs these via `/System/Configuration` (full read-modify-write —
Jellyfin replaces the whole config dict).
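A sketch of that read-modify-write (what the runner does, not the runner
itself):
```bash
curl -ks "$JELLYFIN_URL/System/Configuration" \
  -H "Authorization: MediaBrowser Token=$JELLYFIN_API_TOKEN" \
  | python3 -c 'import json,sys; c=json.load(sys.stdin); c.update(UICulture="en-US", PreferredMetadataLanguage="en", MetadataCountryCode="US"); print(json.dumps(c))' \
  | curl -ks -X POST "$JELLYFIN_URL/System/Configuration" \
      -H "Authorization: MediaBrowser Token=$JELLYFIN_API_TOKEN" \
      -H 'Content-Type: application/json' --data-binary @-
```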
### 2. Per-user
Four keys in each user's `Configuration` object (the nested object inside
`GET /Users/{id}` JSON):
| Key | Pinned value | What it controls |
|---|---|---|
| `UICulture` | `en-US` | The actual UI language the web SPA renders for this user. **This is what fixes the "Abspielen" Play-button bug from doc 15.** |
| `AudioLanguagePreference` | `eng` | Default audio track selection for playback. |
| `SubtitleLanguagePreference` | `eng` | Default subtitle language for playback. |
| `PlayDefaultAudioTrack` | `true` | Play the file's default audio track when languages match — keeps playback deterministic. |
The runner iterates `GET /Users` and POSTs the merged config to
`/Users/{id}/Configuration` for every account.
### 3. Web SPA (pre-auth + UI affordance)
Pinning per-user `UICulture` only kicks in **after** authentication. Two
extra surfaces are pre-auth or user-controllable:
- **Pre-auth bundle strings** (login form, splash, "Sign In" button). The
SPA picks the bundle based on `navigator.language` before any
authentication. Without intervention, a `de-*` browser sees German
login chrome.
- **User settings drawer language switcher.** Even with `UICulture` pinned,
a user can technically reopen `MyProfile/Display` and pick another
language — the pin protects the default but not the switcher.
Both are handled by the web overrides shipped in
`web-overrides/english-lockdown.{js,css}` (sibling-agent commit, separate
file from this doc):
- **`english-lockdown.js`** — runs at the top of `index.html` before the
bundle initialises. Overrides `navigator.language`, `navigator.languages`,
and pins `localStorage["language"]` to `"en-us"` so the bundle's pre-auth
locale loader picks English regardless of browser headers.
- **`english-lockdown.css`** — hides the language `<select>` in the user
settings drawer (`MyProfile/Display`) so users cannot switch off English
via the UI.
The shim is bind-mounted into the live container the same way the existing
`web-overrides/index.html` is — see `docs/10-spa-runtime-shim.md` for the
mount mechanism, and `docs/19-english-only-audit.md` for the per-surface
inventory the shim covers.
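The shape of `english-lockdown.js`, sketched from the description above (the
authoritative file is the sibling-agent commit):
```js
(function () {
  // Pin the locale sources the bundle consults before it initialises.
  try {
    Object.defineProperty(navigator, 'language',  { value: 'en-US', configurable: false });
    Object.defineProperty(navigator, 'languages', { value: ['en-US', 'en'], configurable: false });
  } catch (e) {}
  try {
    // The pre-auth locale loader also reads this localStorage key.
    localStorage.setItem('language', 'en-us');
  } catch (e) {}
})();
```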
### 4. DNS / `Accept-Language`
Browsers always negotiate locale via the `Accept-Language` HTTP request
header. We deliberately do NOT strip or rewrite it at Traefik (would break
unrelated backends fronted by the same proxy). Instead the server is now
authoritative because:
- `UICulture` is pinned per-user (layer 2), so Jellyfin ignores the header
for any authenticated request.
- `navigator.language` is overridden in the SPA shim (layer 3), so the
pre-auth bundle loader doesn't honor the header either.
Net effect: `Accept-Language: de-DE,de;q=0.9,en` arriving from a browser
gets parsed by Jellyfin / the SPA, but every layer that would have used it
has been pinned to English first.
---
## Re-apply procedure
The runner is **idempotent** — running it on an already-locked-down server
is a no-op (each layer is set to its target value, the script verifies and
moves on). It exists to:
- Re-apply after a Jellyfin upgrade (some upgrades reset metadata defaults).
- Re-apply after container recreate (`docker compose down && up`).
- Re-apply after a new user is created via the admin panel (which doesn't
go through `bin/add-jellyfin-user.sh` and so misses the wrapper's
English defaults).
- Re-apply on a schedule for paranoia / drift detection.
### One-shot run
```bash
export JELLYFIN_API_TOKEN=<admin-token> # required
export JELLYFIN_URL=https://arrflix.s8n.ru # optional, this is the default
bin/english-lockdown-runner.sh
```
Output is a one-line summary per surface: server config block, then one
line per user. Exit code 0 means every layer landed; exit code 1 means at
least one POST failed (script prints which).
### Optional: weekly via systemd timer
If you want automatic re-application (paranoia / catch admin-panel drift),
drop a user-level systemd timer pair. The repo deliberately does not ship
these unit files — it's an operator decision how often to run, and where
the API token comes from on a given host.
```ini
# ~/.config/systemd/user/jellyfin-english-lockdown.service
[Unit]
Description=Re-apply ARRFLIX English-only lockdown
After=network-online.target
[Service]
Type=oneshot
EnvironmentFile=%h/.config/arrflix/lockdown.env
ExecStart=%h/code/ARRFLIX/bin/english-lockdown-runner.sh
```
```ini
# ~/.config/systemd/user/jellyfin-english-lockdown.timer
[Unit]
Description=Weekly ARRFLIX English-only lockdown re-apply
[Timer]
OnCalendar=weekly
Persistent=true
[Install]
WantedBy=timers.target
```
`~/.config/arrflix/lockdown.env` should contain
`JELLYFIN_API_TOKEN=<token>` (chmod 600). Enable with
`systemctl --user enable --now jellyfin-english-lockdown.timer`.
---
## Drift-check procedure
Quick verification — run any time without touching state:
**Server-wide (UICulture / metadata):**
```bash
curl -ks "$JELLYFIN_URL/System/Configuration" \
-H "Authorization: MediaBrowser Token=$JELLYFIN_API_TOKEN" \
| python3 -c "import json,sys; c=json.load(sys.stdin); print({k:c.get(k) for k in ('UICulture','PreferredMetadataLanguage','MetadataCountryCode')})"
# Expect: {'UICulture': 'en-US', 'PreferredMetadataLanguage': 'en', 'MetadataCountryCode': 'US'}
```
**Per-user (every account):**
```bash
curl -ks "$JELLYFIN_URL/Users" \
-H "Authorization: MediaBrowser Token=$JELLYFIN_API_TOKEN" \
| python3 -c "
import json, sys
for u in json.load(sys.stdin):
c = u.get('Configuration', {})
print(f\"{u['Name']:10s} UI={c.get('UICulture','<absent>')} A={c.get('AudioLanguagePreference','<absent>')} S={c.get('SubtitleLanguagePreference','<absent>')}\")
"
# Expect every line: UI=en-US A=eng S=eng
```
**Web SPA shim (live bind-mount):**
```bash
curl -ks https://arrflix.s8n.ru/web/english-lockdown.js | head -1
# Expect: an actual JS line, not 404
```
If any of those checks comes back wrong, run the runner:
`JELLYFIN_API_TOKEN=<token> bin/english-lockdown-runner.sh`.
---
## Known gaps
These are explicitly **not** covered by the lockdown. They are documented
here so future operators know what's still possible-but-deferred:
1. **Jellyfin web bundle locale files.** The web bundle still ships
`de.json`, `fr.json`, `es.json`, etc. inside the immutable Docker image.
Replacing those bundle files with English copies would harden the
pre-auth layer further (no German strings on disk → no German strings
possible) but is **destructive to upstream upgrades**: every
`jellyfin/jellyfin` image rebuild would have to repeat the bundle swap.
Deferred indefinitely; the `navigator.language` override in
`english-lockdown.js` is sufficient for current threat model.
2. **Native mobile clients (Jellyfin Android / iOS apps).** These read
per-user `UICulture` correctly, so the per-user layer protects them.
They do NOT load the web SPA shim, so the pre-auth layer does not
apply (but pre-auth on mobile is just the login form, served from
client-side localized resources Jellyfin ships in the app — not under
our control).
3. **Library-level `PreferredMetadataLanguage` / `MetadataCountryCode`
overrides.** Each library can override the server defaults. The runner
pins **server** defaults only — library overrides set in the admin
panel are preserved. Worth a periodic audit
(`GET /Library/VirtualFolders` — sketch after this list) but not part of this lockdown.
4. **Subtitle / track *display* language vs *preference* language.**
`SubtitleLanguagePreference=eng` selects English subs when present.
It does NOT translate non-English subs to English. Out of scope —
that's a media-pipeline concern, not a UI lockdown concern.
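For gap 3, a read-only audit sketch; the `LibraryOptions` keys below are
assumptions to verify against a live `/Library/VirtualFolders` response:
```bash
curl -ks "$JELLYFIN_URL/Library/VirtualFolders" \
  -H "Authorization: MediaBrowser Token=$JELLYFIN_API_TOKEN" \
  | python3 -c "
import json, sys
for lib in json.load(sys.stdin):
    o = lib.get('LibraryOptions') or {}
    print(f\"{lib['Name']:15s} lang={o.get('PreferredMetadataLanguage') or '<server default>'} country={o.get('MetadataCountryCode') or '<server default>'}\")
"
# Expect: every library on <server default>, or an override set on purpose.
```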
---
## Cross-references
- `docs/15-force-english.md` — historical first pass (UICulture per-user
POST mechanism, "Abspielen" Play-button diagnosis). Read for context on
*why* `Configuration.UICulture` is the authoritative knob.
- `docs/16-jellyfin-branding-leaks.md` — related lockdown sweep
(Jellyfin-name and logo redaction). Same pattern: multi-layer pin +
re-apply runner.
- `docs/19-english-only-audit.md` — pre-lockdown baseline. Per-surface
state before the sweep ran.
- `docs/10-spa-runtime-shim.md` — explains the web-overrides bind-mount
mechanism that delivers `english-lockdown.{js,css}` into the live
container.
- `bin/english-lockdown-runner.sh` — idempotent re-apply runner.
Run it any time the server might have drifted.
- `bin/add-jellyfin-user.sh` — wrapper for new user creation; already
bakes in English defaults per `docs/15`.


@@ -1,482 +0,0 @@
# 21 — Rick and Morty Color / HDR Audit (Read-Only)
> Status: **read-only audit**, executed 2026-05-08 against
> `https://arrflix.s8n.ru` (Jellyfin 10.10.3 on nullstone). Scope:
> diagnose why **Rick and Morty looks "kind of gray / washed-out"**
> while other titles render normally. **No fixes applied. No state
> mutated. No transcode triggered.**
>
> Inputs: `ffprobe` via `docker exec jellyfin /usr/lib/jellyfin-ffmpeg/ffprobe`
> against on-disk media; Jellyfin REST `/Items/{id}/PlaybackInfo`,
> `/System/Configuration/encoding`, `/Branding/Configuration` (auth
> `X-Emby-Token: ${JELLYFIN_API_TOKEN}`); contrast probe against
> *The Mandalorian* as a known-good SDR title; review of `CustomCss`
> against the inventory in doc 14 §1b.
---
## 1. Executive summary
**Confirmed root cause:** the Rick and Morty release on disk is an
**HDR10 4K HEVC Main 10 (PQ / BT.2020) "AI Upscale"** of an originally
SDR animated show. Jellyfin classifies it as `VideoRange=HDR`
`VideoRangeType=HDR10` and forces the browser onto the **transcode
path** (`TranscodeReasons=ContainerNotSupported, AudioCodecNotSupported,
SubtitleCodecNotSupported` — every browser session triggers this). The
encoding config has **`EnableTonemapping=false` and
`HardwareAccelerationType=none`**, so ffmpeg software-decodes the
HDR10 source, then h264-encodes **without applying a tonemap**, then
the browser interprets the resulting BT.2020 PQ pixel data as plain
BT.709 SDR. That mis-interpretation is the textbook signature of the
washed-out grey look.
**One-line remediation (lowest blast radius):** in
`/System/Configuration/encoding`, set `EnableTonemapping=true` (the
algorithm `bt2390` is already correctly selected) — this enables CPU
tonemap on the existing software pipeline; CSS, hardware, and source
files do not need to change.
CSS / theme is **ruled out** as a cause — `CustomCss` contains zero
`grayscale(`, zero `saturate(`, zero `hue-rotate(` filters.
---
## 2. ffprobe table — Rick and Morty (Season 01)
All probes via `docker exec jellyfin /usr/lib/jellyfin-ffmpeg/ffprobe -v error -select_streams v:0 …` (a reference invocation follows the reading notes below).
| File | Codec | Profile | Pix fmt | color_space | color_transfer | color_primaries | range | W×H | Bitrate | Size | HDR side-data |
|---|---|---|---|---|---|---|---|---|---|---|---|
| S01E01 — Pilot | hevc | Main 10 | yuv420p10le | bt2020nc | **smpte2084** (PQ) | bt2020 | pc | 3840×2160 | 8.13 Mbit/s | 1.34 GB | **none** (no MasteringDisplay / CLL block) |
| S01E05 — Meeseeks and Destroy | hevc | Main 10 | yuv420p10le | bt2020nc | **smpte2084** | bt2020 | pc | 3840×2160 | 7.97 Mbit/s | 1.26 GB | not present |
| S01E08 — Rixty Minutes | hevc | Main 10 | yuv420p10le | bt2020nc | **smpte2084** | (BT.2020) | (pc) | 3840×2160 | n/a | 1.34 GB | not present |
| S01E11 — Ricksy Business | hevc | Main 10 | yuv420p10le | bt2020nc | **smpte2084** | bt2020 | pc | 3840×2160 | 8.86 Mbit/s | 1.49 GB | not present |
**Reading:**
- `color_transfer=smpte2084` (a.k.a. ST 2084 / PQ) is the **HDR10
transfer function**. All R&M S01 episodes ship with HDR10 tagging.
- `color_primaries=bt2020` + `color_space=bt2020nc` are the BT.2020
wide-gamut primaries (the HDR colour space).
- `pix_fmt=yuv420p10le` = 10-bit-per-component, 4:2:0 chroma subsampling.
Required for HDR10 content.
- `color_range=pc` = full-range (0–1023 for 10-bit) rather than the
TV-range (64–940) usually expected. **This is unusual** — most HDR10
Blu-ray / streaming sources are TV-range. PC-range mis-interpreted
as TV-range is itself a contrast/saturation hit, layered on top of
the HDR-as-SDR hit.
- **No HDR side-data** (`MasteringDisplayMetadata`,
`ContentLightLevelMetadata`) is present in any episode — the source
declares HDR10 but ships without the static-metadata blocks that a
proper HDR display or tonemapper would consume. This is a
fingerprint of a **fake HDR10** AI upscale (the file's own embedded
title is `"Rick and Morty - S01E01 - Pilot - 2160p HDR Ai Upscale -Mesc"`).
- Sanity check: ~8 Mbit/s × ~1320 s runtime ≈ 1.3 GB, matching the
declared file sizes (4K @ 24 fps); no surprises in muxing.
- The poster art / show landing page itself is rendered by the SPA
from JF's image cache (PNG / JPEG, sRGB) — those are not affected by
HDR. Only the **video element** is washed-out.
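For reference, a probe of roughly this shape reproduces the columns above;
the audit's full flag list was elided, so the `-show_entries` keys and the
episode path are assumptions:
```bash
docker exec jellyfin /usr/lib/jellyfin-ffmpeg/ffprobe -v error \
  -select_streams v:0 \
  -show_entries "stream=codec_name,profile,pix_fmt,color_space,color_transfer,color_primaries,color_range,width,height:format=bit_rate,size" \
  -of default=noprint_wrappers=1 \
  "/media/tv/Rick and Morty/Season 01/Rick and Morty - S01E01 - Pilot.mkv"
# HDR10 static metadata (MasteringDisplayMetadata / ContentLightLevel),
# when present, appears as stream side data in plain -show_streams output.
```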
### 2a. Comparison vs. The Mandalorian (known-good SDR)
| File | Codec | Profile | Pix fmt | color_space | color_transfer | W×H | Bitrate |
|---|---|---|---|---|---|---|---|
| Mandalorian S01E01 | hevc | Main 10 | yuv420p10le | **bt709** | **bt709** | 1920×804 | 6.69 Mbit/s |
| Mandalorian S02E01 | hevc | Main 10 | yuv420p10le | **bt709** | **bt709** | (1920×…) | n/a |
| Mandalorian S03E01 | hevc | Main 10 | yuv420p10le | **bt709** | **bt709** | 1920×804 | 6.72 Mbit/s |
**Reading:** Mandalorian is **plain SDR BT.709** (the same colour space
the browser's `<video>` defaults to assume). 10-bit pixels here are
fine because the *transfer* is BT.709 SDR, not PQ — the browser /
ffmpeg pipeline sees this and renders it correctly. This is the
control sample that proves the difference is *content-side*, not
config-side.
---
## 3. Jellyfin encoding config — relevant fields
Source: `GET /System/Configuration/encoding`.
| Field | Value | Comment |
|---|---|---|
| `HardwareAccelerationType` | `"none"` | **GPU is dead** (host has no nvidia driver — see doc 13 finding 02). Every transcode is software ffmpeg. |
| `EnableHardwareEncoding` | `true` | No-op while `HardwareAccelerationType=none`. |
| `EnableTonemapping` | **`false`** | **THE BUG.** Software-tonemap is disabled. With HDR source + `=none` HW + tonemap off, the output is HDR pixels with no SDR conversion. |
| `EnableVppTonemapping` | `false` | Intel-VPP path, not relevant for CPU. |
| `EnableVideoToolboxTonemapping` | `false` | macOS path, not relevant. |
| `TonemappingAlgorithm` | `"bt2390"` | **Good choice** when enabled — the BT.2390 EETF is the modern recommendation. (`hable` is the legacy fallback; `mobius` and `reinhard` are alternatives.) |
| `TonemappingMode` | `"auto"` | Fine. |
| `TonemappingRange` | `"auto"` | Fine. |
| `TonemappingDesat` | `0` | Default. |
| `TonemappingPeak` | `100` | Target SDR peak nits — default. |
| `TonemappingParam` | `0` | Algorithm-specific; 0 = default. |
| `EnableDecodingColorDepth10Hevc` | `true` | 10-bit HEVC decode permitted. |
| `H264Crf` | `23` | h264 quality target for transcode output (default for JF). |
| `H265Crf` | `28` | h265 quality target (unused — `AllowHevcEncoding=false`). |
| `AllowHevcEncoding` | `false` | Cannot transcode-out as HEVC (forces h264 output). |
| `AllowAv1Encoding` | `false` | Cannot transcode-out as AV1. |
| `EnableThrottling` | `false` | Per doc 13 finding 03 — separate issue. |
| `EnableSegmentDeletion` | `false` | Per doc 13 finding 05 — separate issue. |
| `MaxMuxingQueueSize` | `2048` | Per doc 13 — separate issue. |
| `EncoderAppPathDisplay` | `/usr/lib/jellyfin-ffmpeg/ffmpeg` | Bundled jellyfin-ffmpeg, not host. |
| `VaapiDevice` | `/dev/dri/renderD128` | Device node absent on host (no Intel iGPU on AMD Ryzen); irrelevant while HW accel is off. |
| `EncodingThreadCount` | `-1` | Auto = all cores. |
**Net:** the *one* knob standing between "washed-out grey" and
"correctly tonemapped SDR" is `EnableTonemapping`. The algorithm is
already set correctly (`bt2390`). Flipping the bool to `true` is a
single POST-able field-edit and applies to every future transcode.
### 3a. Live PlaybackInfo for R&M S01E01 (browser DeviceProfile)
Simulated browser PlaybackInfo (DeviceProfile: Chrome, h264 / aac /
mp3 / ac3 / eac3, hls):
```
SupportsDirectPlay: false
SupportsDirectStream: false
SupportsTranscoding: true
TranscodingSubProtocol: hls
TranscodingUrl:
/videos/<id>/master.m3u8
?VideoCodec=h264
&AudioCodec=aac,mp3,ac3,eac3
&VideoBitrate=139616000
&SegmentContainer=ts
&hevc-level=150
&hevc-videobitdepth=10
&hevc-profile=main10
&TranscodeReasons=
ContainerNotSupported,
AudioCodecNotSupported,
SubtitleCodecNotSupported
```
**Reading:** every browser session for R&M is forced into transcode by
three independent reasons (container `mkv`, audio `truehd` / `ac3`,
subtitle `pgs` / `ass` — confirmed by MediaInfo). It's not just an HDR
issue — the file *cannot* direct-play in any browser, so the transcode
path is mandatory, and inside that path tonemap is currently off.
For comparison, an SDR Mandalorian episode would still hit the
transcode path for the same container/audio reasons, but the
tonemap-off flag wouldn't matter because the source is already BT.709.
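The simulation is reproducible with a plain `POST /Items/{id}/PlaybackInfo`
(item ID from §7a). The trimmed DeviceProfile below is a stand-in for
Chrome's much larger real one, so treat it as a sketch:
```bash
# Read-only: PlaybackInfo decides the codec path but starts no ffmpeg.
# (Append ?UserId=<id> if the token is a server API key, not a user token.)
curl -ks -X POST \
  "$JELLYFIN_URL/Items/324f75b84f394a5d9b0749c0679f23b9/PlaybackInfo" \
  -H "X-Emby-Token: $JELLYFIN_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "DeviceProfile": {
      "MaxStreamingBitrate": 139616000,
      "DirectPlayProfiles": [
        {"Container": "mp4", "Type": "Video",
         "VideoCodec": "h264", "AudioCodec": "aac,mp3"}
      ],
      "TranscodingProfiles": [
        {"Container": "ts", "Type": "Video", "Protocol": "hls",
         "VideoCodec": "h264", "AudioCodec": "aac,mp3,ac3,eac3"}
      ]
    }
  }' \
  | python3 -m json.tool \
  | grep -E "SupportsDirectPlay|SupportsDirectStream|TranscodeReasons"
```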
---
## 4. Theme / CSS rule-out check
Inspected `/Branding/Configuration → CustomCss` (25 225 chars, full
inventory in doc 14 §1b). Searched the live string for any
filter / saturation / hue-rotate / opacity rule that could desaturate
the video element or its container.
| Filter pattern | Matches in CustomCss | Verdict |
|---|---|---|
| `grayscale(` | **0** | ✓ |
| `saturate(` | **0** | ✓ |
| `hue-rotate(` | **0** | ✓ |
| `sepia(` | **0** | ✓ |
| `brightness(` | **0** | ✓ |
| `contrast(` | **0** | ✓ |
| `invert(` | **0** | ✓ |
| `mix-blend-mode` | **0** | ✓ |
| `filter:` | **0** | ✓ |
| `backdrop-filter:` | **0** | ✓ |
| `opacity:` (on `.itemBackdrop` / `video` / `.osdContainer`) | **0** | ✓ |
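The tally above is reproducible read-only; a sketch (endpoint and token
header per this audit's inputs, exact one-liner is an assumption):
```bash
curl -ks "$JELLYFIN_URL/Branding/Configuration" \
  -H "X-Emby-Token: $JELLYFIN_API_TOKEN" \
  | python3 -c "
import json, sys
css = json.load(sys.stdin).get('CustomCss') or ''
for pat in ('grayscale(', 'saturate(', 'hue-rotate(', 'sepia(',
            'brightness(', 'contrast(', 'invert(', 'mix-blend-mode',
            'filter:', 'backdrop-filter:'):
    print(f'{pat:18s} {css.count(pat)}')
"
# Expect 0 for every pattern.
```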
Also checked the doc 14 §7 detail-page backdrop rules that just landed
(`linear-gradient(90deg, rgba(0,0,0,0.95) 0%, …)`) — that gradient is
applied to `.layout-desktop .backgroundContainer.withBackdrop`, NOT to
the `<video>` element. It tints the *backdrop poster behind the
detail-page header*, not playback. **Not the cause.**
`web-overrides/index.html` (the bind-mounted critical-path style): no
`filter:`, no `mix-blend-mode`, no animation that would alter video.
`ARRFLIX-SHIM` JavaScript only touches `document.title`, favicon, and
`mypreferencesmenu` drawer entries — does not touch playback DOM.
**Theme / CSS rule-out: PASS.** The greyness is in the pixel data
delivered to the browser, not in any post-render CSS effect.
---
## 5. Source-file integrity rule-outs
Already visible in §2, but stated explicitly so each candidate root
cause is closed:
| Hypothesis | Evidence | Verdict |
|---|---|---|
| (a) HDR file + CPU tone-map | All R&M S01 = HDR10 (`smpte2084`/`bt2020`). Encoding config `EnableTonemapping=false`, `HardwareAccelerationType=none`. | **CONFIRMED** root cause. |
| (b) CSS filter on theme | §4 shows zero filter/saturation rules. | RULED OUT. |
| (c) Direct-play tag mismatch | PlaybackInfo §3a shows `SupportsDirectPlay=false` — browser is on transcode path, no chance of DP-tag confusion. | RULED OUT. |
| (d) Source is genuinely SDR but graded flat (wrong tags) | ffprobe reports HDR10 tags consistently across 4 episodes, and Jellyfin agrees (`VideoRangeType=HDR10`). Title-string `"2160p HDR Ai Upscale"` confirms intent. | RULED OUT — the source IS HDR10, just badly so. |
| (e) Container / bit-depth / browser HW-decode bit-crush | Browser never receives the 10-bit HEVC because transcode is mandatory; output is 8-bit h264. So no client-side bit-depth issue is possible. | RULED OUT. |
| (f) Missing Mastering Display / CLL metadata makes tonemap target unknown | True — files have no static HDR metadata. Once tonemap is enabled, ffmpeg will fall back to defaults (peak 1000 nits, etc.) which is fine for cartoon AI-upscale content; better than no tonemap. | NOT a blocker for the fix. |
| (g) `color_range=pc` (full-range) | Full-range PC pixels reinterpreted as TV-range = an additional contrast bump. Tonemap filter handles range conversion. | Subsumed by (a) — same fix. |
---
## 6. Concrete remediation list (ranked: effort vs blast-radius)
### #1 — Enable software tonemap (recommended)
**Action:** flip a single bool in encoding config.
```
POST /System/Configuration/encoding
EnableTonemapping = true
(TonemappingAlgorithm already = "bt2390" — leave as-is)
(TonemappingPeak already = 100 — leave as-is)
(TonemappingMode already = "auto" — leave as-is)
(TonemappingRange already = "auto" — leave as-is)
(TonemappingDesat already = 0 — leave as-is)
```
(Or via UI: *Dashboard → Playback → Transcoding → "Enable
tone-mapping"*.)
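Scripted equivalent, as a get-modify-post round-trip (sketch; assumes `jq`
on the admin box):
```bash
curl -ks "$JELLYFIN_URL/System/Configuration/encoding" \
  -H "X-Emby-Token: $JELLYFIN_API_TOKEN" \
  | jq '.EnableTonemapping = true' \
  | curl -ks -X POST "$JELLYFIN_URL/System/Configuration/encoding" \
      -H "X-Emby-Token: $JELLYFIN_API_TOKEN" \
      -H "Content-Type: application/json" --data-binary @-
```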
**Effect:** every future HDR-source transcode applies BT.2390 EETF +
gamut conversion (BT.2020 → BT.709) before h264 encoding. Output looks
right in any SDR browser.
**Cost:** zero seek time, no restart needed.
**Blast radius:** **low.** Only HDR sources (currently: Rick and
Morty S01) are affected. SDR sources (Mandalorian etc.) already have
BT.709 tags so the tonemap filter is a no-op for them.
**Caveat:** software tonemap on a 4K HEVC source on the existing
host load (doc 13 finding 01: load 11.4, swap 6.8 GiB) will add
~1.5–2× extra CPU per stream compared to a tonemap-off transcode.
Pair this with **doc 13 finding 03 (`EnableThrottling=true`)** so a
client-cancelled stream stops burning CPU; otherwise a stalled R&M
playback will eat a core for 12 minutes (`SegmentKeepSeconds=720`).
**Risk of "looks worse than expected":** AI-upscale R&M has no real
HDR — the wide-gamut tonemap will give a result that is more saturated
than the original Adult Swim broadcast (cartoon flat colours pushed
through BT.2020 round-trip), but visibly correct relative to current
washed-out grey. If the operator wants the original cartoon look,
remediation #3 below.
### #2 — Pair tonemap-on with throttling-on (doc 13 finding 03)
**Action:** when applying #1, also set:
```
EnableThrottling = true
EnableSegmentDeletion = true
```
**Effect:** caps wasted ffmpeg CPU after a client disconnects — already
recommended in doc 13 audit, doubly important once we add tonemap
overhead.
**Cost:** zero additional. Same UI page as #1.
### #3 — Replace R&M with a properly-graded SDR release (highest fidelity, highest effort)
**Action:** swap the `Rick.and.Morty.S01...2160p.HDR.Ai.Upscale-Mesc`
files for a native SDR encode (e.g., the original Adult Swim
1080p / WEB-DL releases or the 2160p SDR remasters where they exist).
**Effect:** zero tonemap cost (source is already BT.709), faster
transcodes, files shrink ~3-4× (8 Mbit/s 4K HDR → ~2 Mbit/s 1080p
SDR for a 22-min cartoon is plenty), consistent look with rest of
library.
**Cost:** medium — re-acquisition + re-import + re-scan; upside: ~90 GB
freed on `/home`, which is currently 90 % full (doc 13 finding 01).
**Blast radius:** medium. Watched-state and metadata stay (Sonarr
will re-match by `(2013)` + episode index), but each episode item ID
in JF will change → existing playback positions on R&M are lost.
### #4 — Pre-transcode R&M S01 to SDR offline (middle-ground)
**Action:** run `ffmpeg` once (outside Jellyfin) with the same
tonemap pipeline, write SDR-tagged HEVC files alongside, swap them in.
```sh
# Per episode (CPU intensive, ~1 hr per 22-min episode on this host).
# Note: vanilla ffmpeg's software `tonemap` filter implements hable /
# mobius / reinhard but NOT bt2390 (Jellyfin's own software path uses
# jellyfin-ffmpeg's tonemapx for that) — hable is the stand-in here.
ffmpeg -i in.mkv \
  -map 0:v:0 -map 0:a -map 0:s? \
  -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p10le" \
  -c:v libx265 -preset slow -crf 22 -profile:v main10 \
  -c:a copy -c:s copy out.mkv
```
**Effect:** Jellyfin no longer needs to tonemap on every play —
files are SDR-tagged at rest. CPU at playback drops to a normal
HEVC-software-decode-then-h264-software-encode (still no GPU but
no extra tonemap stage).
**Cost:** ~11 hours wall-clock on the existing 12-core box for
S01's 11 episodes (CPU-only HEVC encode); +20 GB during transcode,
files end ~30% smaller than HDR originals.
**Blast radius:** medium-low. Rewrites only R&M — other library
entries untouched. Item IDs change (same as #3).
### #5 — Wait for GPU restoration, then enable VPP / NVENC tonemap
**Action:** once nvidia driver is back on the host (doc 13
finding 02), set:
```
HardwareAccelerationType = nvenc
EnableTonemapping = true
EnableVppTonemapping = true (if Intel — N/A on Ryzen)
HardwareDecodingCodecs = [hevc, h264, vc1] (add hevc)
```
**Effect:** GPU does the HEVC decode + tonemap + h264 encode. No CPU
load, real-time on 4K. This is the long-term right answer.
**Cost:** L (host driver work). Already on doc 13 fix-list.
**Blast radius:** large but already planned. Until GPU is back, do
remediation #1.
### Recommended order
1. **Apply #1 + #2 today** (single Playback-settings page edit). Cost
~30 s of ops time, immediate visual fix on R&M, no media churn.
2. Re-test R&M playback (see §7).
3. If the tonemapped result still feels "wrong" because R&M is a
cartoon and the AI-upscale's HDR is a fiction, go to **#4 or #3**
for the long-term cure.
4. Park #5 behind the GPU restoration backlog.
---
## 7. Test plan to verify after fix
### 7a. Pre-fix baseline (capture now, before flipping the bool)
1. Open `https://arrflix.s8n.ru/web/#/details?id=324f75b84f394a5d9b0749c0679f23b9`
in Chrome/Firefox on onyx.
2. Hit Play. Pause at ~30 s in.
3. Take a screenshot (full-window). File:
`evidence/21-pre-fix-rm-s01e01-30s.png`.
4. Note the visible characteristics: Rick's lab-coat (should be pure
white but currently looks pale-grey), background green of the
garage, skin tones.
### 7b. Apply remediation #1 + #2
UI path: *Dashboard → Playback → Transcoding*:
- Enable "Tone-mapping"
- Enable "Throttle transcodes"
- Enable "Delete transcode segments"
(Or POST `/System/Configuration/encoding` directly with the three
bools flipped.)
No restart needed — Jellyfin re-reads `encoding.xml` per request.
### 7c. Post-fix verification
1. **New playback session** — close and reopen the browser tab so the
SPA requests a fresh `PlaybackInfo` and a fresh `master.m3u8`
(existing in-flight transcode is locked to the pre-fix ffmpeg
command line). Easiest: hard-reload (`Ctrl-Shift-R`) and re-click
Play.
2. Pause at the same ~30 s mark.
3. Screenshot to `evidence/21-post-fix-rm-s01e01-30s.png`.
4. **Side-by-side compare** the two images. Expectations:
- Whites are noticeably whiter (lab coat, ship hull).
- Saturation is higher (garage greens, sky blues, characters).
- Black-level remains similar (or slightly deeper).
- Skin tones look natural rather than greenish-grey.
### 7d. Server-side sanity checks (5 min after first post-fix play)
```sh
# Confirm tonemap is in the actual ffmpeg command line for this stream
ssh user@192.168.0.100 \
"docker exec jellyfin ps -ef | grep ffmpeg | grep -E 'tonemap|zscale' | head"
# Expected: a process line containing `zscale=...:t=linear:...,tonemap=bt2390,...`
# If the line lacks `tonemap`, the encoding.xml change didn't apply or
# JF has a cached transcode session — bounce the container.
# Confirm HDR-aware filter graph fed only by HDR sources (Mandalorian
# should NOT have tonemap in its ffmpeg cmdline)
ssh user@192.168.0.100 \
"docker exec jellyfin tail -200 /config/log/log_*.log | grep -E 'tonemap|smpte2084'"
```
### 7e. Negative test (other libraries unaffected)
Play one episode of:
- Mandalorian (SDR BT.709) — should look identical pre/post.
- Futurama / American Dad / Obi-Wan — same expectation (probe these
if you want to be thorough; they're outside this audit's scope).
If any of these now look *over*-saturated or *under*-saturated post-fix,
the tonemap is leaking onto SDR sources — open a bug, set
`TonemappingMode` from `auto` to a stricter mode.
### 7f. Performance check (CPU is the operative resource)
While the post-fix R&M episode is playing:
```sh
ssh user@192.168.0.100 "uptime && top -bn1 -p \$(pgrep -f 'ffmpeg.*Rick.and.Morty' | head -1) | tail -5"
```
- Expect ffmpeg to consume ~600–900 % CPU (6–9 cores) on this host
for a 4K HEVC→h264 + tonemap pipeline.
- If load average climbs past 16 sustained or swap usage grows past
baseline 6.8 GiB, escalate doc 13 finding 01 — pair with
remediation #4 (pre-transcode the season) sooner rather than later.
### 7g. Long-term verification
A week after the fix, check:
```sh
# Number of 499 client-cancel events on jellyfin@docker
docker logs traefik --since 168h 2>&1 | grep '"jellyfin@docker"' | grep ' 499 ' | wc -l
```
Should be ≤ pre-fix baseline (currently 2 / hour, doc 13 finding 03).
If it climbs after enabling tonemap (because the tonemap stage
slowed transcodes enough to let the client time out), that's the
trigger to invest in remediation #4 or #5.
---
## 8. What was NOT touched during this audit
- No POST/PUT to `/System/Configuration/encoding`.
- No POST to `/System/Configuration/branding`.
- No `docker exec jellyfin` writes (read-only `ls`, `cat`, `ffprobe`).
- No `docker compose` action, no container restart.
- No file modification on `/home/user/media/`.
- No transcode triggered (PlaybackInfo simulation only — that endpoint
decides codec paths but does not start ffmpeg).
---
## 9. Sign-off
- **Auditor:** s8n (audit pass, 2026-05-08)
- **Live config at audit time:** Jellyfin 10.10.3,
`EnableTonemapping=false`, `HardwareAccelerationType=none`,
`TonemappingAlgorithm=bt2390`. CSS = Cineplex v1.0.6 + ARRFLIX
brand layer (no greyscale filters).
- **Confirmed root cause:** HDR10 source (R&M S01) + CPU-only
pipeline + tonemap disabled = HDR pixels delivered as SDR =
washed-out grey.
- **Recommended fix:** flip `EnableTonemapping=true` (one
Playback-settings checkbox) AND `EnableThrottling=true` +
`EnableSegmentDeletion=true` (pair-finding from doc 13).
- **Next audit due:** **2026-08-08** alongside doc 13's quarterly
rotation, or sooner if a new HDR source lands in another library.


@@ -1,517 +0,0 @@
# 22 — Jellyfin Runtime Performance Audit (server scope)
> Status: **read-only audit**, executed 2026-05-08 ~17:30–17:45 BST against
> `https://arrflix.s8n.ru` (Jellyfin 10.10.3 on nullstone, container `jellyfin`).
> Scope: server runtime — CPU, RAM, container limits, FFmpeg, scheduled
> tasks, plugins. Network/edge, storage, color/HDR are out of scope (sibling
> agents). Supplements doc 13 (2026-05-08, host-capacity scan); does not
> repeat findings already in 13 unless the data has materially changed.
> **No fixes applied. No state mutated. No container restart.**
---
## 1. Executive summary — top 3 perf culprits
| # | Culprit | Severity | Evidence (one line) |
|---|---|:-:|---|
| 1 | **4 concurrent ffmpeg processes for ONE viewer**, each upscaling 1080p → 2160p with PGS subtitle burn-in, no throttling, no segment deletion | **CRITICAL** | `ps`: PIDs 1681949 (643 % CPU), 1685275 (135 %), 1685316 (133 %), 1685478 (132 %) — all transcoding `Rick and Morty S01E01.mkv`, all `-vf scale=3840:2160` + `[0:4]overlay` subtitle burn. Container CPU 690–876 % across 3 samples |
| 2 | **Forgejo BlueBuild CI container running uncapped on the same 12-core host** (noisy neighbor) | **HIGH** | `docker stats`: `FORGEJO-ACTIONS-TASK-202_..._Build-push-OCI` 88–99 % CPU, 4.3 GiB RAM, 5 GB net-in. Both jellyfin and the build container have `Memory=0 NanoCpus=0 CpuQuota=0` (no limits). Aggregate load 15.43 / 14.61 / 8.85 on 12 cores |
| 3 | **GPU acceleration still off** (already in doc 13 finding 02; quantified here) — every CPU transcode spawns one ffmpeg burning 6–8 cores per stream because of the 4K-upscale + sub-overlay filtergraph | **HIGH** | `HardwareAccelerationType=none`. Per-ffmpeg cost on this filtergraph: ~6.4 cores at `preset=veryfast`. 2 viewers transcoding = full host pegged |
**Biggest quick-win:** turn on **transcode throttling + segment deletion**
(doc 13 finding 03 already flags this; new evidence here makes it
non-optional). The 4-stream pile-up in §3 is exactly what those two
flags exist to prevent — without them, every client seek/reload spawns a
fresh ffmpeg and the previous one keeps burning a core for up to 720 s
(`SegmentKeepSeconds=720`). Two checkbox flips in Playback settings.
---
## 2. Resource snapshot (3 samples, 10 s apart)
| Sample @time | jellyfin CPU% | jellyfin MEM | NET I/O | BLOCK I/O | PIDs |
|---|---:|---:|---:|---:|---:|
| t=0 | **834.3 %** | 2.635 GiB / 31.27 GiB (8.42 %) | 5.36 / 158 MB | 1.14 / 855 MB | 101 |
| t=10s | **690.5 %** | 2.637 GiB | 5.37 / 158 MB | 1.22 / 894 MB | 102 |
| t=20s | **876.7 %** | 2.646 GiB | 5.37 / 158 MB | 1.32 / 942 MB | 101 |
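For reproducibility, samples of this shape produce the rows above (the
exact format string is an assumption):
```bash
for i in 1 2 3; do
  docker stats --no-stream \
    --format '{{.Name}} {{.CPUPerc}} {{.MemUsage}} {{.NetIO}} {{.BlockIO}} {{.PIDs}}' \
    jellyfin
  sleep 10
done
```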
**Container limits:** `Memory=0 NanoCpus=0 CpuQuota=0 CpuPeriod=0
PidsLimit=<none> RestartPolicy=unless-stopped`. **No CPU or RAM cap on
the jellyfin container.** Same for the Forgejo build container.
**Host (nullstone, 12-core AMD Ryzen 5 2600X, 32 GiB RAM, 24 GiB swap):**
- `uptime`: load avg **15.43 / 14.61 / 8.85** — 1-min load 28 % above
core count. 5-min trend confirms sustained load. Doc 13 logged 11.40 /
9.59 / 6.19 ~13 h ago, so the host has been getting *worse*, not better.
- `free -h`: 31 GiB total, 10 GiB used, 8.2 GiB free, 13 GiB buff/cache;
swap **7.8 GiB / 24 GiB used** (32 %). `SwapCached=771 MB` (kernel is
actively servicing swap-in from cache — i.e. swap-thrash signature).
- `vmstat 1 5`: `r=3–27`, `cs=30 K–41 K/s` (very high context-switch
rate), `si ≤ 24 KB/s, so ≈ 0` (paging in but not out — recovering, not
thrashing right this second), `us=70–72 % sy=10–13 % id=16–18 %
wa=0 %`.
- `iostat -x`: `nvme0n1` w/s ≈ 38–433, `wkB/s` ≈ 364–2 272, util
`0.4–0.9 %`. **Disk is not the bottleneck — CPU is.**
**All-container CPU% (sorted, top 5):**
| Container | CPU% | MEM | Notes |
|---|---:|---:|---|
| jellyfin | **773–876** | 2.6 GiB | this audit's target |
| FORGEJO-ACTIONS-TASK-202_..._Build-push-OCI | **88–99** | 4.3 GiB | uncapped CI build, see §3 culprit 2 |
| traefik | 9 | 48 MiB | routine reverse proxy |
| forgejo | 9 | 207 MiB | git web |
| minecraft-mc | 7 | 4 GiB | racked.ru server |
| (28 other containers) | < 5 % combined | | none material |
The two CPU monsters together (jellyfin + bluebuild) account for **~90 %
of the 12-core host's user time** during this audit window.
---
## 3. Active sessions + active transcodes
**Sessions (within last 600 s):** **1**
| User | Client | Device | RemoteIP | NowPlaying | PlayMethod | Pos |
|---|---|---|---|---|---|---|
| s8n | Jellyfin Web | Chrome | 192.168.0.10 | Rick and Morty S01E01 — Pilot | DirectPlay (claimed) / **Transcoding** (actual) | 8 s |
**TranscodingInfo on the active session:**
```
VideoCodec → h264 (libx264, preset=veryfast, crf=23)
AudioCodec → aac (libfdk_aac, 256 kbps stereo, +6 dB volume gain)
Resolution → 3840 × 2160 (UPSCALE — source is 1080p)
Bitrate → 13.8 Mbps
Container → fmp4 / hls
HW → none
Reasons → VideoCodecNotSupported, AudioCodecNotSupported, SubtitleCodecNotSupported
Direct → IsVideoDirect=False, IsAudioDirect=False
Completion → 0 % (just started)
```
**Active ffmpeg processes on host: 4** (all for the same viewer, same
file — see §5).
The session reports `PlayMethod=DirectPlay` while *also* presenting a
`TranscodingInfo` block — Jellyfin's session DTO returns the last-set state,
so this is the client just navigating into the page; the actual decision was
**transcode** (the 4 ffmpegs confirm it). The HLS player sometimes flips
`PlayMethod=Transcode` only after the first segment downloads; the pre-roll
state matches the 4-process pile-up in §5.
---
## 4. Scheduled tasks
All tasks **Idle**. None in progress. Last-run durations are tiny — no
scheduled task is the culprit. Library scan runs every 6 h (last
`14:14:04`, 0.3 s wall — only 187 items so it converges instantly).
| Name | State | Last end (UTC+1) | Last duration | Trigger |
|---|---|---|---:|---|
| Audio Normalization | Idle | 2026-05-08T00:58 | 0.0 s | IntervalTrigger |
| Clean Cache Directory | Idle | 2026-05-08T00:58 | 0.1 s | IntervalTrigger |
| Clean Log Directory | Idle | 2026-05-08T00:58 | 0.0 s | IntervalTrigger |
| Clean Transcode Directory | Idle | 2026-05-08T16:22 | 0.0 s | StartupTrigger |
| Clean up collections and playlists | Idle | 2026-05-08T16:22 | 0.0 s | StartupTrigger |
| Download missing lyrics | Idle | 2026-05-08T00:58 | 0.1 s | IntervalTrigger |
| Download missing subtitles | Idle | 2026-05-08T00:58 | 0.0 s | IntervalTrigger |
| Extract Chapter Images | Idle | 2026-05-08T01:00 | 0.0 s | DailyTrigger |
| Generate Trickplay Images | Idle | 2026-05-08T02:00 | 0.1 s | DailyTrigger |
| Media Segment Scan | Idle | 2026-05-08T14:14 | 0.0 s | IntervalTrigger |
| Optimize database | Idle | 2026-05-08T00:58 | 0.2 s | IntervalTrigger |
| Refresh Guide | Idle | 2026-05-08T00:58 | 3.2 s | IntervalTrigger |
| Refresh People | Idle | 2026-05-08T00:58 | 0.3 s | IntervalTrigger |
| Scan Media Library | Idle | 2026-05-08T14:14 | 0.3 s | IntervalTrigger |
| TasksRefreshChannels | Idle | 2026-05-08T00:58 | 0.1 s | IntervalTrigger |
| Update Plugins | Idle | 2026-05-08T16:22 | 1.2 s | StartupTrigger |
| Clean Activity Log / Keyframe Extractor / Migrate Trickplay Image Location | Idle | (never run) | — | — |
**Container restarted at 16:22:06 today** (StartupTrigger task end-times
imply a restart — last audit had `StartedAt=02:13:01`, and doc 13 finding 30
expected 0 restarts). Operator likely restarted the container ~1 h before
this audit; not material to perf but worth noting.
**Verdict:** culprit (a) "scheduled task hogging CPU" → **ruled out**.
---
## 5. FFmpeg processes on host (snapshot)
**4 simultaneous ffmpeg processes, all transcoding the same source for
the same viewer.** This is the smoking gun. The process tree from inside
the container shows just `1 jellyfin` (parent) + `1579 ffmpeg` + `1725
ffmpeg` (the others were still spawning); host `ps -ef` shows 4
ffmpegs owned by `user` uid 1000.
| PID | %CPU | %MEM | RSS | etime | What | Subs filter |
|---:|---:|---:|---:|---:|---|---|
| 1681949 | **643** | 6.9 | 2.27 GB | 53 s | `-ss 33s` HLS seek | **yes**`[0:4]scale,scale=3840:2160:fast_bilinear[sub] ; [0:0]scale=3840:2160 [main] ; overlay` |
| 1685275 | **135** | 4.4 | 1.45 GB | 6 s | `-ss 15s` HLS seek | yes — same chain |
| 1685316 | **133** | 4.4 | 1.45 GB | 6 s | full transcode (no -ss) | no — plain `setparams + scale + format=yuv420p` |
| 1685478 | **132** | 3.9 | 1.29 GB | 4 s | full transcode `-canvas_size 1920x1080` | yes — same chain |
| 1669243 (earlier sample, then died) | ~759 | 7.0 | 2.30 GB | 254 s | full transcode | no |
**What every ffmpeg is doing:**
- Decoding source 1080p H.265 (or H.264 — Pilot is x264 Bluray rip).
- **Upscaling video to 3840×2160 with `scale=...:fast_bilinear`.**
- **Burning PGS subtitle stream `0:4` ALSO upscaled to 3840×2160 onto
the video.** This is the heaviest overlay path the JF filtergraph
produces.
- Re-encoding to H.264 `libx264 preset=veryfast crf=23 high@L5.1` with
`maxrate=13.5 Mbps`.
- `-threads 0` (= use all cores), `-max_muxing_queue_size 2048`.
- HLS fmp4 segments to `/cache/transcodes/<sessionId><n>.mp4`.
**Why 4 of them at once for one user:** every time the client seeks or
reloads, JF starts a new ffmpeg with a new sessionId and a new segment
file prefix. Because `EnableThrottling=false` and
`EnableSegmentDeletion=false` (doc 13 findings 03/05), the old ffmpeg
keeps producing segments to its own cache prefix and **does not exit
until `SegmentKeepSeconds=720` elapses**. Three observed cache prefixes
right now: `8e8a8538…`, `ef1caecc…` (already produced segments 030 →
~73 MiB), `3ba3fce4…`, `b6f150cb…`, `fcc6137e…` — five session-IDs
across the last ~5 minutes for one viewer.
**Why each ffmpeg is so expensive:**
- 1080p → 4K upscale ≈ 4× pixel volume.
- PGS subtitle stream is also being scaled to 4K and overlaid (alpha
blend) every frame.
- `libfdk_aac` 256 kbps is fine, the cost is essentially all video.
- On 12 logical cores at `preset=veryfast`, this filtergraph burns
**~6.4 cores per ffmpeg** (643 % observed). Two simultaneous
transcodes = a full host. Four = swap thrash plus the load
avg of 15.
**Why is it upscaling to 4K at all?** Likely the client requested a
profile that picked the "max bitrate / max-resolution" capability of
the device (a desktop Chrome will report 4K-capable). The Jellyfin
ladder is either (a) "always pick highest profile" or (b) the user's
client is set to "Auto" with no max-resolution cap. **No client-side
bitrate cap is set on this user** (doc 13 reported
`RemoteClientBitrateLimit=0`). Combine that with PGS subs the client
can't render → forced burn-in → the 4K-overlay tax kicks in.
**ffprobe storms:** at 13:31 the log shows **7 simultaneous ffprobe
calls** (Mandalorian S2 episodes, all at once); at 17:37 **another 7
simultaneous ffprobes** (Mandalorian S3). Each ffprobe with
`-analyzeduration 200M -probesize 1G` reads up to 1 GiB into RAM. Cause:
operator clicked into the season 2/3 page → JF kicks subtitle-search
for every episode at once because `LibraryMetadataRefreshConcurrency=0`
(0 = default = core count, i.e. 12 here). Doc 13 finding 14 already calls
out the concurrency-cap fix; this audit confirms the symptom.
**Verdict:** the **single biggest user-visible "loads kinda slow"** is
the 4K-upscale subtitle-burn pile-up.
---
## 6. Plugin status
All 6 plugins **Active**. None in Faulted/Restart. No exception loops in
log from plugin assemblies.
| Name | Version | Status |
|---|---|---|
| AudioDB | 10.10.3.0 | Active |
| MusicBrainz | 10.10.3.0 | Active |
| OMDb | 10.10.3.0 | Active |
| Open Subtitles | 20.0.0.0 | Active *(but mis-configured — see §7)* |
| Studio Images | 10.10.3.0 | Active |
| TMDb | 10.10.3.0 | Active |
**Verdict:** culprit (e) "plugin throwing repeated exceptions in log
spam loop" → **partially confirmed for OpenSubtitles** (it throws on
every probe — 234 today already), but the cost is per-probe RTT not
sustained CPU. Fix is doc 13 finding 04.
---
## 7. Log error / warning summary (last 24 h, today's `log_20260508.log`)
`/config/log/log_20260508.log` is **3 968 lines**. Filtered tally:
| Pattern | Count today | Notes |
|---|---:|---|
| `[ERR]` total | **266** | |
| `[WRN]` total | **124** | |
| `Error downloading subtitles from "Open Subtitles"` | **234** | doc 13 finding 04 — `Username/Password` empty, throws `AuthenticationException` per file probed; **88 % of all errors today are this one cause** |
| `No space left on device : '/config/metadata/library/...'` | **2** | at 13:53:10 — transient ENOSPC during a metadata write; disk now 62 % full (146 GiB free), so this is a moving-target burst (probably caused by 73 MiB+ of transcode segments accumulating in `/cache/transcodes` while a metadata write tried to extend a small file). **Worth watching** but not the current bottleneck |
| `Invalid username or password entered` (auth fail) | 5 | three distinct minutes — looks like a user retrying creds, not a brute-forcer |
| `WS ... error receiving data` (websocket abrupt close) | ~25 | normal: clients closing tabs / dropping carrier. Noise, not a defect |
| `Compiling a query which loads related collections...` (EF Core warning, slow query) | 1 | EF Core's `QuerySplittingBehavior` warning — Jellyfin upstream issue, harmless on this dataset |
| `task was canceled` on `/videos/.../hls1/main/-1.mp4` | 1 (17:41) | client gave up mid-segment-init — same 499 family as doc 13's evidence |
| `SQLITE_BUSY` / `database is locked` | **0** | culprit (d) DB lock contention → **ruled out** |
**Verdict:**
- culprit (e) "plugin log spam" → confirmed (234 OS errors / day = a
scan or page-into-season triggers a loop of failures).
- culprit (d) "DB lock contention" → ruled out (0 SQLITE_BUSY).
- the **2 ENOSPC errors are NEW vs doc 13** and warrant tracking — see
§9 culprit 4.
---
## 8. DB and cache sizes
```
/config/data/jellyfin.db 288 K (was 208 K in doc 13 — fine)
/config/data/library.db 3.4 M (was 3.3 M — fine)
/config/data/library.db-wal 6.2 M (was 4.4 M — STILL LARGER THAN MAIN, see below)
/config/data/library.db-shm 32 K
/config/metadata 99 M (was 92 M — fine)
/config/log 4.2 M (was 1.3 M — 3× growth in 14 h driven by §7 OS spam)
/cache/transcodes 84 M / 43 files (snapshot)
/cache total not measurable from in-container du (mount appears empty due to bind layout)
```
**library.db-wal (6.2 MB) is now ~1.8× the main `.db` (3.4 MB).** Doc 13
finding 08 already raised this — the situation is slightly worse now
(WAL grew faster than main during 14 h). Cause: SQLite checkpoints on
*idle*, but with continuous transcode + ffprobe activity from two
viewers and library refreshes there is rarely an idle moment to
checkpoint. **Manual `Optimize database` will collapse the WAL into
the main file.**
**`/cache/transcodes` 84 MB / 43 files** is the residue of three+
abandoned ffmpeg sessions. Without `EnableSegmentDeletion=true`, this
will grow unbounded for `SegmentKeepSeconds=720` per session. Worst
case at 1 viewer × 4 zombie sessions × 720 s × 13 Mbps ≈ **4.4 GiB of
transient cache** per pile-up. **This is exactly how the
13:53 ENOSPC happened** (cache + metadata fighting for the same
146-GiB free pool).
---
## 9. Concrete remediation list (ranked impact / effort)
### 9.1 Quick-wins (rank 1 → 4 — all are minutes of work, all read-only-safe to apply)
1. **Cap two transcode flags** (Settings → Playback):
- `EnableThrottling = true`
- `EnableSegmentDeletion = true`
*Effect:* zombie ffmpeg from a stale session is killed instead of
producing 720 s of segments after the client has moved on. **This
single change directly addresses §5's 4-process pile-up.** Doc 13
already noted this; the new evidence escalates it from "S effort,
cleanup" to **"non-optional"**.
2. **Cap concurrency knobs** (Settings → Server / Library):
- `LibraryScanFanoutConcurrency = 4`
- `LibraryMetadataRefreshConcurrency = 4`
- `ParallelImageEncodingLimit = 4`
*Effect:* 7-up ffprobe burst at 13:31 / 17:37 (§5) is capped to 4
parallel probes, not 12. Doc 13 already noted this as S effort.
3. **Set `RemoteClientBitrateLimit`** (Dashboard → Playback → Streaming
→ "Internet streaming bitrate limit"):
- Suggest `8 Mbps` (covers 1080p Bluray rips, kills 4K-upscale
decisions on remote sessions). LAN clients that want full-bitrate
can be flagged via per-user policy.
*Effect:* the 13.8 Mbps maxrate-on-WAN session becomes an 8-Mbps
session that **doesn't need the 4K-upscale path** — JF stops asking
ffmpeg to produce 3840×2160. **This is what makes §5's per-stream
cost drop by ~70 %.** Independent of GPU.
4. **Disable Open Subtitles plugin OR populate creds** (already in
doc 13 finding 04). Removes 234 ERR/day, restores log signal,
removes the per-probe RTT.
### 9.2 Investments (rank 5 → 7 — half-day to multi-day, structural)
5. **Add CPU + memory limits to BOTH `jellyfin` and the Forgejo
`BlueBuild` build container in compose** — currently both are
uncapped, fighting for the same 12 cores (sketch after this list). Suggest:
- `jellyfin`: `cpus: 8.0`, `mem_limit: 12G`, `mem_reservation: 4G`
- `forgejo-runner` build pods: `cpus: 4.0`, `mem_limit: 8G`
*Effect:* a noisy CI build cannot drag interactive playback
latency to the floor; viewer always has 8 cores even when
BlueBuild is hot. Note that the BlueBuild container is short-lived
(forgejo-actions spawns it per job) so the limit goes on the
runner's `container_options` in the runner config, not on a static
compose service.
6. **Re-enable GPU transcoding on host** (doc 13 finding 02 — L
effort). With H.264 NVENC at preset `p4` the same filtergraph
collapses from ~6.4 CPU cores to ~0.3 CPU cores + GPU. Without
GPU, the §5 quick-wins are the cap; with GPU, the host can
serve 4 simultaneous viewers comfortably.
7. **Cap the maximum supported resolution in client policy** (Dashboard
→ Users → each user → Playback → "Maximum allowed video bitrate" /
"Maximum allowed video resolution"). Set non-admin users to
`1080p` max. Closes the foot-gun where any client says "I can do
4K" and Jellyfin obliges with a 4K-upscale CPU bomb.
### 9.3 Watch-list (no immediate action, monitor next audit)
- ENOSPC at 13:53 (only 2 occurrences, host has 146 GiB free now, so
it was a transient pressure burst). Re-check post-quick-wins (1+2
remove the cache pile-up that caused it).
- `library.db-wal` 1.8× main db — manual `Optimize database` after the
above tasks finish, or tighten its schedule from 24 h to 6 h.
- Container restart at 16:22 (was 02:13 in doc 13) — was this operator-
initiated or did `unless-stopped` re-spin a crash? Check
`docker logs jellyfin --since 6h` for `panic`/`crash` next time.
---
## 10. Quick-win vs investment summary
| Bucket | Action | Effort | Expected impact |
|---|---|---|---|
| **Quick-win** | Throttling + SegmentDeletion ON | 2 clicks | Kills §5 zombie ffmpegs immediately; expected load avg drop 40–50 % under one active viewer |
| **Quick-win** | Concurrency caps 12 → 4 | 3 fields | Removes the 7-up ffprobe bursts at season-page navigation |
| **Quick-win** | RemoteClientBitrateLimit = 8 Mbps | 1 field | Stops Jellyfin choosing 4K-upscale paths for WAN clients; ~70 % drop in per-stream CPU |
| **Quick-win** | OpenSubs disable / cred | 30 sec | 234 ERR/day → 0; cleaner log; faster library scans |
| **Investment** | Compose CPU/MEM caps for jellyfin + bluebuild | 30 min compose + 1 restart per container | Removes noisy-neighbor head-of-line blocking by the CI runner |
| **Investment** | GPU transcode reactivation | days (driver work, host) | 20× per-stream CPU efficiency on the 1080p-and-up paths |
| **Investment** | Per-user max-resolution policy | 5 min × N users | Prevents admin foot-gun and any future invitee from triggering the 4K-upscale path |
---
## Appendix — raw evidence
### Container limits (the absence is the finding)
```
docker inspect jellyfin --format '{{.HostConfig.Memory}} {{.HostConfig.NanoCpus}}
{{.HostConfig.CpuQuota}} {{.HostConfig.CpuPeriod}}
{{.HostConfig.PidsLimit}} {{.HostConfig.RestartPolicy.Name}}'
→ 0 0 0 0 <no value> unless-stopped
```
### Host CPU + load + memory
```
nproc: 12
lscpu Model: AMD Ryzen 5 2600X Six-Core Processor (6c / 12t, NUMA0=0–11)
uptime: 17:42:14 up 4 days 17:59, 2 users, load average: 15.43, 14.61, 8.85
free -h: total 31Gi, used 10Gi, free 8.2Gi, buff/cache 13Gi
swap total 24Gi, used 7.8Gi (32 %), SwapCached 789 976 kB
vmstat 1 5 (us / sy / id / wa, last sample): 71 / 13 / 16 / 0
r=11, b=1, cs ≈ 30 K/s
iostat (nvme0n1): 38–433 w/s, 364–2 272 wkB/s, util 0.4–0.9 % — disk idle
```
### Top hosts on host (snapshot)
```
ps -eo pid,user,pcpu,pmem,rss,etimes,args --sort=-pcpu | head:
1681949 user 643 % 6.9 % 2.30 GB 53 s ffmpeg [Rick & Morty S01E01, 4K-upscale + sub burn]
1662267 root 52 % 0.1 % — 426 s fuse-overlayfs (BlueBuild rootfs mount)
1661952 root 36 % 0.1 % — 431 s fuse-overlayfs (BlueBuild rootfs)
1485847 git 8 % 0.8 % 266 MB — gitea web (forgejo)
364785 user 8 % 2.6 % 867 MB — openclaw-gateway
1901802 java 8 % 12.7 % 4.2 GB — minecraft jvm (-Xmx14336M)
1660709 root 7 % 0.3 % 100 MB 442 s buildah build (BlueBuild)
1626511 user 4 % 1.6 % 544 MB — /jellyfin/jellyfin (server proc)
```
### All 4 active ffmpeg's (full filter chain shown for the heaviest one)
```
PID 1681949 (643 % CPU):
-ss 33s -noaccurate_seek -canvas_size 1920x1080
-i Rick.and.Morty.S01E01.mkv
-threads 0 -map 0:0 -map 0:1 -map -0:0
-codec:v libx264 -preset veryfast -crf 23 -maxrate 13546858 -bufsize 27093716
-profile:v high -level 51
-filter_complex
[0:4]scale,scale=3840:2160:fast_bilinear[sub] ;
[0:0]setparams=color_primaries=bt709:color_trc=bt709:colorspace=bt709,
scale=trunc(min(max(iw,ih*a),min(3840,2160*a))/2)*2
:trunc(min(max(iw/a,ih),min(3840/a,2160))/2)*2,
format=yuv420p[main] ;
[main][sub]overlay=eof_action=pass:repeatlast=0
-codec:a libfdk_aac -ac 2 -ab 256000 -af volume=2
-f hls -hls_time 3 -hls_segment_type fmp4 ...
```
### Sessions API (exactly 1 user, mismatched `PlayMethod` vs `TranscodingInfo`)
```
GET /Sessions?activeWithinSeconds=600 → 1 session
user=s8n client=Jellyfin Web/Chrome remote=192.168.0.10
PlayMethod=DirectPlay (claimed)
TranscodingInfo:
VideoCodec=h264 AudioCodec=aac Container=fmp4/hls
3840x2160 @ 13.8 Mbps HW=none IsVideoDirect=False IsAudioDirect=False
Reasons = [VideoCodecNotSupported, AudioCodecNotSupported, SubtitleCodecNotSupported]
Completion = 0.0 %
```
### Scheduled tasks (none in progress)
(Full table in §4 — every task is `Idle`, last-run durations 0–3.2 s.)
### Plugins (all 6 Active, no faulted)
```
AudioDB 10.10.3.0 Active
MusicBrainz 10.10.3.0 Active
OMDb 10.10.3.0 Active
Open Subtitles 20.0.0.0 Active ← 234 ERR/day from auth-empty creds (doc 13 finding 04)
Studio Images 10.10.3.0 Active
TMDb 10.10.3.0 Active
```
### Log tally (today's `log_20260508.log`, 3 968 lines)
```
[ERR] lines: 266
[WRN] lines: 124
"Error downloading subtitles from Open Subtitles": 234 ← 88 % of all ERR
"No space left on device": 2 ← 13:53:10, transient
"Invalid username or password entered" (login): 5
"WS ... error receiving data": ~25 ← noise
"task was canceled" / 499: 1 ← 17:41
"SQLITE_BUSY" / "database is locked": 0
EF Core "QuerySplittingBehavior" warning: 1 ← upstream JF
```
### Disk (host vs container view)
```
host df -h /home: 399G 233G 146G 62 % (was 90 % in doc 13 — improved)
host df -i /home: ~1.49M used / ~26.6M total (6 %) — inodes healthy
container df -h /config /cache /media: same FS, same 146G free
```
### Items / counts
```
GET /Items/Counts → MovieCount=2 SeriesCount=6 EpisodeCount=181
ArtistCount=0 ProgramCount=0 TrailerCount=0
SongCount=0 AlbumCount=0 MusicVideoCount=0
```
### Container restart (StartedAt today)
```
Implied from ScheduledTasks where Trigger=StartupTrigger:
Clean Transcode Directory → end 16:22:06 ← container start ≈ 16:22:05
Clean up collections and playlists → end 16:22:06
Update Plugins → end 16:22:07
(doc 13 had StartedAt = 02:13:01)
```
### Forgejo BlueBuild container (noisy neighbor, no limits)
```
docker stats: CPU 88–99 % MEM 4.3 GiB NET in 5 GB BLOCK in/out 296 MB / 35.3 GB
docker inspect: Memory=0 NanoCpus=0 CpuQuota=0 ← uncapped
```
---
## Sign-off
- Audit: 2026-05-08, read-only, ~15 min wall.
- No fixes applied. No state mutated. No container restart. No plugins
reloaded. No tasks executed.
- Scope respected: server runtime only. Color/HDR, edge/network, and
storage findings deferred to sibling agents.
- Next audit due: **2026-08-08** (quarterly, paired with doc 13).


@@ -1,587 +0,0 @@
# 23 — ARRFLIX Edge / Network / Browser-Load-Path Audit
> Status: **read-only audit**, executed 2026-05-08 from onyx
> (192.168.0.6 LAN) against `https://arrflix.s8n.ru` (Jellyfin 10.10.3
> behind Traefik on nullstone). Scope: edge — DNS, TLS, Traefik,
> compression, cache headers, asset waterfall, ServiceWorker. **No
> fixes applied. No state mutated. No container restart. No Traefik
> reload.**
>
> Sibling audits cover color/HDR, server runtime, and storage. This one
> is the edge slice only. Pairs with doc 13 (server-side optimization
> audit, 2026-05-08) — that one calls out CPU/transcode; this one
> identifies why every page-nav over WAN feels gluey when the server is
> idle.
---
## 1. Executive summary
The two cold-load complaints ("loads kinda slow") trace to three edge
defects:
1. **No HTTP compression at all.** Traefik has zero `compress`
middleware defined in either static (`traefik.yml`) or dynamic
(`config/dynamic.yml`) config, and the Jellyfin router only attaches
`security-headers@file`. Result: Jellyfin's 28 webpack JS bundles
ship raw — **2.74 MiB of JS over the wire on every cold load**.
With gzip (default ratio ~0.30 for minified JS) that drops to
~0.82 MiB. With brotli (~0.25) it drops to ~0.69 MiB. **Severity:
R — fix this week.**
2. **No `Cache-Control` on hashed-asset URLs.** Every JS bundle
comes back with `etag` + `last-modified` and **no** `cache-control`
header. Browsers default to "heuristic freshness" (typically 10 % of
`last-modified` age) but every reload **does still issue a
conditional `If-None-Match` request per asset** and gets a 304
back. On a cold-cache page-nav that's **28 round-trips of pure
negotiation overhead** even when the response body is cached.
These URLs are content-hashed (`?7dc095d8…`), so they should be
`Cache-Control: public, max-age=31536000, immutable`. **Severity:
Y → R when WAN clients are involved (each round-trip costs an
internet RTT).**
3. **Poster image first-fetch is slow** — the very first cold request
for a `/Items/{id}/Images/Primary` triggers a Jellyfin server-side
image transcode (resize + JPEG re-encode) and runs in **~385 ms
wall** vs **~2535 ms** for warm cache. With ~20 posters on the home
page and no edge cache, the first visit to "Recently Added" is a
**~7-second poster grid**. Doc 13 finding 23 (3 MB splash PNG) is
the loud single hit; this is the death-by-a-thousand-cuts equivalent
for the home page. **Severity: Y.**
Everything else (TLS handshake, MTU, DNS lookup, HTTP/2 vs HTTP/3,
cert chain depth, Traefik middleware chain, Pi-hole hairpin) is
healthy or low-impact — full table below.
**Top quick win:** add a `compress@file` middleware in
`/opt/docker/traefik/config/dynamic.yml` and reference it from the
Jellyfin router. **One file edit. Two lines of YAML in the middleware
block, one line on the router. ~70 % cold-load wire-size reduction.**
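A sketch of that edit (verify the router name against the live
file-provider layout before pasting):
```yaml
# /opt/docker/traefik/config/dynamic.yml — middleware block (sketch).
# Then append `compress@file` to the Jellyfin router's `middlewares:`
# list, next to the existing `security-headers@file`.
http:
  middlewares:
    compress:
      compress: {}   # response compression, negotiated via Accept-Encoding
```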
---
## 2. Curl timing breakdown (5 samples, p50, p95)
Test: `curl https://arrflix.s8n.ru/web/index.html` from onyx.
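The tables below come from a timing template of roughly this shape (exact
invocation is an assumption; the `-w` variables are standard curl):
```bash
curl -so /dev/null \
  -w 'dns=%{time_namelookup} conn=%{time_connect} tls=%{time_appconnect} ttfb=%{time_starttransfer} total=%{time_total}\n' \
  --resolve arrflix.s8n.ru:443:192.168.0.100 \
  https://arrflix.s8n.ru/web/index.html
```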
### LAN-direct (`--resolve` to 192.168.0.100)
| Sample | DNS | CONN | TLS | TTFB | TOTAL |
|--------|-----|------|-----|------|-------|
| 1 | 0.000024 | 0.001225 | 0.022960 | 0.031569 | 0.040531 |
| 2 | 0.000024 | 0.001217 | 0.020182 | 0.024190 | 0.030353 |
| 3 | 0.000028 | 0.001437 | 0.025502 | 0.030467 | 0.035793 |
| 4 | 0.000023 | 0.001501 | 0.021998 | 0.037444 | 0.041056 |
| 5 | 0.000023 | 0.001265 | 0.018536 | 0.022942 | 0.027066 |
| **p50** | **24 µs** | **1.27 ms** | **22.0 ms** | **30.5 ms** | **35.8 ms** |
| **p95** | **28 µs** | **1.50 ms** | **25.5 ms** | **37.4 ms** | **41.1 ms** |
### Hostname (onyx /etc/hosts → 192.168.0.100)
| p50 | DNS 0.34 ms | CONN 1.6 ms | TLS 23.0 ms | TTFB 27.5 ms | TOTAL 33.7 ms |
### Notes
- DNS via `/etc/hosts` adds ~300 µs vs `--resolve`. Negligible.
- TLS handshake is the dominant cost (≥60 % of TTFB). TLS 1.3 with
`TLS_AES_128_GCM_SHA256`, **2-cert chain depth** (Let's Encrypt R13
→ ISRG Root X1), no avoidable latency there. Connection reuse will
hide it on subsequent requests within the same browser session.
- **TTFB ≤ 40 ms even on cold connection** — server-side latency for
the index.html body itself is fine. The "feels slow" perception is
**not** in this number; it's in the 28-bundle waterfall after
index.html.
---
## 3. Compression / cache header table
Probed with `Accept-Encoding: gzip, br, zstd`. Every asset was
served raw.
| Asset | Type | Bytes | Encoding | Cache-Control | ETag |
|-------|------|------:|----------|---------------|------|
| `/web/index.html` | text/html | 65 485 | **none** | **(none)** | yes |
| `/web/runtime.bundle.js?…` | text/js | 49 152 | **none** | **(none)** | yes |
| `/web/main.jellyfin.bundle.js?…` | text/js | 499 108 | **none** | **(none)** | yes |
| `/web/node_modules.@jellyfin.sdk.bundle.js?…` | text/js | 740 699 | **none** | **(none)** | yes |
| `/web/node_modules.@mui.material.bundle.js?…` | text/js | 381 100 | **none** | **(none)** | yes |
| `/web/node_modules.core-js.bundle.js?…` | text/js | 182 469 | **none** | **(none)** | yes |
| `/web/node_modules.react-dom.bundle.js?…` | text/js | 128 970 | **none** | **(none)** | yes |
| `/web/node_modules.@tanstack.query-core.bundle.js?…` | text/js | 101 747 | **none** | **(none)** | yes |
| `/web/node_modules.lodash-es.bundle.js?…` | text/js | 24 604 | **none** | **(none)** | yes |
| `/web/themes/dark/theme.css` | text/css | 8 631 | **none** | **(none)** | yes |
| `/web/manifest.json` | json | 781 | **none** | **(none)** | yes |
| `/web/serviceworker.js` | text/js | 768 | **none** | **(none)** | yes |
| `/web/favicon.ico` | image/x-icon | 6 830 | **none** | **(none)** | yes |
| `/web/touchicon.png` | image/png | 8 515 | **none** | **(none)** | yes |
| `/Items/.../Images/Primary` (cold) | image/jpeg | ~46 000 | **none** | `public` (no max-age) | — |
Verification — index.html negotiated against four different
`Accept-Encoding` headers. All four returned `content-length: 65485`
and **no** `content-encoding` — confirms Traefik isn't selectively
disabling compression by `User-Agent`/path; the middleware simply
isn't in the chain.
ETag-revalidation works correctly: a follow-up
`If-None-Match: "1db3a353daaafa4"` returns `HTTP/2 304` immediately —
so warm-load is "fast" only because nothing has changed since cold
load. The browser still pays a round-trip per asset.
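One-asset reproduction of the probe behind this table (read-only, headers
only):
```bash
curl -ksI -H 'Accept-Encoding: gzip, br, zstd' \
  https://arrflix.s8n.ru/web/index.html \
  | grep -iE '^(content-encoding|content-length|cache-control|etag|last-modified)'
# Today: no content-encoding and no cache-control — both findings above.
```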
---
## 4. Asset cold-load waterfall (top by size)
`/web/index.html` references **28 webpack-emitted JS bundles** (full
list at `/tmp/edge-audit/bundles.txt` during audit; file generated by
parsing `<script src=…>` tags in index.html and discarded after
report). All 28 share the same query-string version
`?7dc095d8f634f60f309c` — they ARE content-versioned URLs and SHOULD
be cached `immutable`.
| Rank | Bundle | Bytes | Notes |
|---:|---|---:|---|
| 1 | `node_modules.@jellyfin.sdk.bundle.js` | 740 699 | Largest single file. Compresses ~70 %. |
| 2 | `main.jellyfin.bundle.js` | 499 108 | App bundle. Compresses ~70 %. |
| 3 | `node_modules.@mui.material.bundle.js` | 381 100 | MUI components. Compresses ~75 %. |
| 4 | `node_modules.core-js.bundle.js` | 182 469 | Polyfills. Compresses ~75 %. |
| 5 | `node_modules.react-dom.bundle.js` | 128 970 | React DOM. Compresses ~75 %. |
| 6 | `node_modules.@tanstack.query-core.bundle.js` | 101 747 | React-Query. Compresses ~70 %. |
| 7 | `node_modules.jellyfin-apiclient.bundle.js` | 88 025 | Compresses ~70 %. |
| 8 | `node_modules.jquery.bundle.js` | 87 296 | Compresses ~70 %. |
| 9 | `node_modules.axios.bundle.js` | 80 291 | Compresses ~70 %. |
| 10 | `node_modules.date-fns.esm.bundle.js` | 74 309 | Compresses ~70 %. |
| 11 | `node_modules.@remix-run.router.bundle.js` | 72 992 | |
| 12 | `37869.bundle.js` | 70 690 | Lazy chunk. |
| 13 | `runtime.bundle.js` | 49 152 | Webpack runtime. |
| 14 | `node_modules.webcomponents.js.bundle.js` | 39 705 | |
| 15 | `node_modules.@mui.icons-material.bundle.js` | 30 861 | |
| — | (13 more bundles, each 5–30 KB) | ~351 000 | |
| **Total JS** | **28 bundles** | **2 806 173** | **(2.68 MiB raw)** |
| + | `index.html` | 65 485 | |
| + | `theme.css` + assets | ~32 000 | |
| **Cold-load total** | | **~2.76 MiB** | **uncompressed** |
Wall-time measurements from onyx (LAN-direct, sequential):
- **5 top bundles, sequential GET, LAN:** 0.37 s for 1.65 MiB.
- **All 28 bundles, sequential GET, LAN:** 1.51 s for 2.68 MiB.
A real browser uses HTTP/2 multiplexing so won't be strictly
sequential, but `connection-window` + `flow-control` mean wire-time on
WAN scales nearly linearly with total bytes. Compression alone would
cut wire-time ~70 %.
Estimated post-compression total: **~0.82 MiB** (gzip) or **~0.69 MiB**
(brotli). At a 50 Mbps WAN, that's a 200–300 ms cold-load saving
*before* any RTT improvements from cache headers.
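A quick sanity-check of the ~70 % figure, compressing the largest bundle locally (gzip here is a stand-in for whatever Traefik's middleware would negotiate; the query hash is the one observed above):

```bash
# Sketch: local gzip as a proxy for Traefik's compress middleware.
url="https://arrflix.s8n.ru/web/node_modules.@jellyfin.sdk.bundle.js?7dc095d8f634f60f309c"
raw=$(curl -s "$url" | wc -c)
gz=$(curl -s "$url" | gzip -6 | wc -c)
echo "raw=$raw gzip=$gz saving=$(( (raw - gz) * 100 / raw ))%"
```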
---
## 5. ServiceWorker warm-load effectiveness
**Conclusion: SW does NOT cache app assets.** Verified by reading
`/web/serviceworker.js` (768 B, last modified 2024-11-19 — Jellyfin
10.10.3 ship date).
The SW only handles `notificationclick` events (cancel-install /
restart-server actions) and a one-shot `activate` → `clients.claim()`.
There is **no `fetch` handler**, no `install` precache, no asset
caching at all. This matches doc 13 finding 11.
So the warm-load is doing exactly what the browser HTTP cache + ETag
flow gives us: 28 conditional GETs, each returning 304 with empty
body but a full TLS-multiplexed round-trip. With proper
`Cache-Control: max-age=31536000, immutable` on the hashed URLs,
all 28 of those revalidations would collapse into zero network
traffic on warm load.
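One way to re-verify this read-only from any client (a sketch; it just greps the 768-byte worker for handler registrations):

```bash
# Sketch: confirm the ServiceWorker registers no fetch/install handlers.
sw=$(curl -s https://arrflix.s8n.ru/web/serviceworker.js)
if ! printf '%s' "$sw" | grep -qE "addEventListener\(['\"](fetch|install)"; then
  echo "no fetch/install listeners: SW cannot precache or serve assets"
fi
```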
---
## 6. Poster image timing
Tested against Rick and Morty series ID
`548035d5e4d36cd2f488900ab612581a`,
`/Items/{id}/Images/Primary?fillHeight=300&fillWidth=200&quality=96`.
| Request | TTFB | TOTAL | Bytes |
|---|---:|---:|---:|
| **Cold (uncached size variant)** | 385 ms | 388 ms | 45 660 |
| Warm 1 | 26 ms | 29 ms | 45 660 |
| Warm 2 | 38 ms | 42 ms | 45 660 |
| Warm 3 | 34 ms | 38 ms | 45 660 |
| Warm 4 | 37 ms | 42 ms | 45 660 |
| **Cold h=400** | — | 351 ms | 79 925 |
| **Cold h=500** | — | 469 ms | 112 168 |
| **Cold h=600** | — | 364 ms | 145 505 |
Response headers:
```
HTTP/2 200
age: 0
cache-control: public ← no max-age
content-disposition: attachment ← unusual on a poster (forces 'save')
content-type: image/jpeg
last-modified: <request time> ← unhelpful for caching
vary: Accept
```
Two issues here:
- **`Cache-Control: public` with no `max-age`** means the browser
applies heuristic freshness (10 % of last-modified age = 0 s, since
last-modified equals the response time). Effectively uncached. Every
navigation back to the home page re-fetches all posters.
- **Server-side image transcode is the dominant cost.** Jellyfin
generates the `fillHeight=300&fillWidth=200&q=96` variant on demand
from the source poster image, then caches it in
`/cache/images/`. `age: 0` on response confirms this was a fresh
generation. Doc 13 finding 26 puts the on-disk image cache at 15 MB
total — small enough that recent-cache eviction may be culling
variants.
Per-poster cold cost: 350–470 ms. Twenty posters at unique
`fillHeight` × `fillWidth` × `quality` variants on the first load
of "Recently Added" total **~7 s** if the browser falls back to
sequential poster fetches (HTTP/2 multiplexes, so the true cost is
server-side, CPU-bound variant generation). Doc 13 finding 02 (no GPU
transcode, 12-core box already at load 11.4) means even this is
software-rendered.
`content-disposition: attachment` on an image fetched into an
`<img>` tag doesn't actually force a download (the browser ignores
the disposition for media references), but it's a Jellyfin-side
oddity worth noting.
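For reproducibility, a sketch of the timing probe behind the table above (the `X-Emby-Token` value is a placeholder; the item ID and query string are the ones tested):

```bash
# Sketch: 1 cold + 4 warm poster fetches with curl's timing variables.
id="548035d5e4d36cd2f488900ab612581a"
url="https://arrflix.s8n.ru/Items/$id/Images/Primary?fillHeight=300&fillWidth=200&quality=96"
for i in 1 2 3 4 5; do
  curl -s -o /dev/null -H "X-Emby-Token: REDACTED" \
    -w "TTFB=%{time_starttransfer}s TOTAL=%{time_total}s SIZE=%{size_download}b\n" \
    "$url"
done
```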
---
## 7. Traefik request-log latency analysis
`docker logs traefik --since 6h | grep jellyfin@docker` — total 116
requests, 78 of them at 0 ms (cache hits / 304s / 401s).
Latency histogram (ms suffix on each log line):
| Bucket | Count |
|---|---:|
| 0 ms | 78 |
| 1 ms | 8 |
| 3 ms | 1 |
| 7 ms | 1 |
| 18–46 ms | 4 |
| 92–294 ms | 5 |
| 346–648 ms | 4 |
| 1.1–2.1 s | 3 |
| 4.9–9.5 s | 4 |
**Every entry above ~50 ms is a `/videos/.../hls1/main/*.mp4`
HLS-segment GET, not a `/web/*` static asset.** Decoded request
URIs show the slow ones are AV1 + HEVC transcode requests with
`VideoBitrate` values of 362–547 Mbit and 500/499 final status —
exactly the pattern doc 13 finding 03 calls out (CPU-only transcode
+ no throttling). Edge layer is clean: every `/web/*` request that
appeared in the 6-hour window completed in 0–7 ms wall.
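Both the histogram and the slow-request decode are reproducible with a one-liner (assumes the access-log lines end in the `<n>ms` duration suffix, as in the appendix sample):

```bash
# Sketch: latency histogram from the Traefik access log, 6-hour window.
docker logs traefik --since 6h 2>&1 |
  grep 'jellyfin@docker' |
  grep -oE '[0-9]+ms$' |
  sort -n | uniq -c | sort -k2 -n
```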
Status code distribution for `jellyfin@docker` (6 h):
| Code | Count |
|---:|---:|
| 200 (filtered out by accessLog statusCodes 400-599) | (not logged) |
| 400 | 1 |
| 401 | 7 |
| 404 | 68 |
| 405 | 8 |
| 499 | 15 |
| 500 | 8 |
| 502 | 1 |
The 68 × 404 are mostly `Cineplex/CSS/icon` references from the bundled
theme @import-ing assets that Jellyfin doesn't ship — cosmetic, but
each 404 is a wasted RTT on every cold-load (browser fetches the
referenced URL, gets 404, retries on next page nav). Worth a separate
look in coordination with doc 09 (Cineplex theme).
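A read-only starting point for that look, a sketch assuming Traefik's default CLF access-log layout (request path in field 7, status in field 9; adjust if the log format differs):

```bash
# Sketch: which URLs produced the 68 404s, most frequent first.
docker logs traefik --since 6h 2>&1 |
  grep 'jellyfin@docker' |
  awk '$9 == 404 {print $7}' |
  sort | uniq -c | sort -rn | head -20
```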
---
## 8. Traefik middleware audit
### Static config (`/opt/docker/traefik/traefik.yml`)
```yaml
entryPoints:
websecure:
address: ":443"
http:
middlewares:
- security-headers@file
- rate-limit@file
```
### Jellyfin router (`/opt/docker/jellyfin/docker-compose.yml`)
```yaml
labels:
- "traefik.http.routers.jellyfin.middlewares=security-headers@file"
```
### Effective middleware chain at `/web/*` request
1. `security-headers@file` (entrypoint) — header rewrites, no body
processing, ~zero CPU.
2. `rate-limit@file` (entrypoint) — token-bucket avg=100 burst=200
period=1s. Pure counter, ~zero CPU. **Not** a regex chain. **Not**
doing CPU-significant work.
3. `security-headers@file` (router, **duplicate**) — applied a second
time to the response. Idempotent (header overwrite is a no-op when
value already set), but **redundant** and a small CPU waste
per-request. Worth deduping.
### What's missing
- **`compress` middleware**. Traefik supports it with a one-liner:
```yaml
middlewares:
compress:
compress: {}
```
Defaults to gzip + brotli, sizes ≥1024 B, smart `Accept-Encoding`
negotiation. Not present anywhere.
- **No `headers.customResponseHeaders.Cache-Control`** override on
the Jellyfin router — Traefik would let us inject
`Cache-Control: public, max-age=31536000, immutable` for
`/web/*.bundle.js?*` requests via a `replacePath` + `headers`
combination, OR (cleaner) Jellyfin can be configured to send the
right headers itself; this is config, not architecture.
### Traefik middleware chain on other Jellyfin paths
The `no-guest@file` allowlist seen in dynamic.yml is **not** referenced
by the Jellyfin router (per doc 09 §1.2 it was intentionally dropped
when WAN exposure was added). That matches expectation; not an edge
performance issue.
The `headscale-deny-leaks` and `signup-strict` middlewares are
defined but only referenced by other routers. No effect on Jellyfin.
---
## 9. DNS / hairpin / MTU
| Probe | Result | Verdict |
|---|---|---|
| Pi-hole DNS lookup `dig arrflix.s8n.ru @192.168.0.1` | returns **`82.31.156.86` (WAN)** | **Y — split-horizon missing.** Onyx's `/etc/hosts` pin saves it; any LAN client without that entry hairpins through the router. |
| Onyx hairpin to WAN IP, full TTFB | 33–43 ms | **G — hairpin works, no NAT-loopback latency penalty.** |
| LAN MTU `ping -M do -s 1472 -c 3` | 1480/1480/1480, 1.17–1.75 ms | **G — full 1500 MTU, no fragmentation, no PMTUD penalty.** |
| `--resolve` LAN-direct vs hostname | DNS adds 300 µs | **G — negligible.** |
The Pi-hole gap is a doc-09-related exposure decision: arrflix.s8n.ru
has public DNS on Gandi pointing at the WAN IP, no Pi-hole local
override. For a LAN-first deploy you'd add a local DNS rewrite
`arrflix.s8n.ru → 192.168.0.100` on Pi-hole. Per memory note
`feedback_s8n_hosts_override.md`, this is a known pattern (`/etc/hosts`
pin on each device works, but doesn't scale to phones).
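A sketch of that rewrite (fix #6 in §11 below), assuming a stock Pi-hole container layout; the container name `pihole` and the `custom.list` path are assumptions to verify first:

```bash
# Sketch: pin arrflix.s8n.ru to the LAN IP in Pi-hole, then verify.
echo "192.168.0.100 arrflix.s8n.ru" >> /opt/docker/pihole/etc-pihole/custom.list
docker exec pihole pihole restartdns reload   # reload FTL so the entry is live
dig +short arrflix.s8n.ru @192.168.0.1        # expect 192.168.0.100 now
```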
---
## 10. HTTP/2, HTTP/3, TLS
- **HTTP/2:** confirmed (`HTTP/2 200` response, multiplexing
available).
- **HTTP/3 (QUIC):** **not enabled.** `Alt-Svc` header is absent on
every probe. (My local libcurl doesn't support `--http3` so I can't
client-test, but the lack of `Alt-Svc` advertises that the server
doesn't speak QUIC.) Traefik ≥ 2.8 supports HTTP/3 via experimental
`entryPoints.websecure.http3 = {}` block; not enabled in
`traefik.yml`. **Y — would help WAN clients on lossy links** (mobile
data, café WiFi); near-zero benefit on LAN.
- **TLS chain:** 2 certs (leaf + LE R13 intermediate) → ISRG Root X1
is in client trust store. Chain length is minimal; not contributing
to handshake latency.
- **TLS version:** 1.3 with AEAD cipher (`TLS_AES_128_GCM_SHA256`).
- **`sniStrict: true`** in dynamic.yml's `tls.options.default`. Correct.
---
## 11. Concrete remediation list (ranked by impact-per-effort)
| # | Fix | Effort | Impact | Risk |
|---:|---|:-:|---|---|
| **1** | **Add `compress@file` middleware** in `/opt/docker/traefik/config/dynamic.yml`: `compress: {}` under `http.middlewares.compress`. Reference it from the Jellyfin router via a `traefik.http.routers.jellyfin.middlewares=security-headers@file,compress@file` label edit in `/opt/docker/jellyfin/docker-compose.yml`. | **S** (5 min) | **~70 % cold-load wire reduction** (2.74 MiB → ~0.82 MiB). Lowers TTI on every single first-visit. | Low — Traefik's `compress` is a standard middleware, gzip+br, content-type allow-list does the right thing for `application/javascript` + `text/html` + `text/css`. Will not compress `image/jpeg`. |
| **2** | **Add `Cache-Control: public, max-age=31536000, immutable` for `/web/*.bundle.js?*` and `/web/*.css?*` requests.** Cleanest path is via Traefik `headers` middleware with `customResponseHeaders.Cache-Control` and a router rule that matches `Path(\`/web/\`) && Query(\`hash\`)` — but Jellyfin can also be patched at the source if there's appetite. | S–M | Eliminates 28 × per-page-nav round-trips for warm load. Saves ~28 RTTs (~1.5 s on a 50-ms WAN link, ~0 on LAN). | Medium — must scope ONLY to hashed URLs; if `Cache-Control: immutable` is applied to `index.html` you brick the next deploy until users force-reload. |
| **3** | **Enable HTTP/3 / QUIC.** Add `entryPoints.websecure.http3 = {}` to `traefik.yml`, expose UDP 443 on the host, and add an `Alt-Svc: h3=":443"; ma=86400` header (Traefik does this automatically once the HTTP/3 entrypoint is on). | M | Marginal on LAN, real on lossy WAN (3G, café WiFi). Cuts TLS handshake to 1-RTT. | Low — Traefik HTTP/3 has been stable since v3.0; coexists with H/2. Need to open UDP 443 on nullstone firewall + router port-forward. |
| **4** | **Tighten poster image cache.** Either set `Cache-Control: public, max-age=86400` on `/Items/*/Images/Primary` responses (Jellyfin-side — would need a web-server-config patch; there's no clean first-class setting), or put a Traefik-level `headers.customResponseHeaders.Cache-Control` on `Path(\`/Items/\`) && PathPrefix(\`/Images/\`)`. Even 1 hour of caching collapses the poster grid re-fetch on home-page bounce-back. | S–M | ~7 s saved on home-page revisit when posters were already fetched. | Low — posters are content-addressed by `?fillHeight=…&quality=…`; safe to cache. |
| **5** | **Dedupe security-headers middleware.** Remove the entrypoint-level `security-headers@file` OR remove it from each per-router label. (Cleanest: keep it at entrypoint level, drop from labels.) | S | Tiny (microseconds per request). Cleanup, not perf. | Low. |
| **6** | **Add Pi-hole local DNS rewrite for `arrflix.s8n.ru` → `192.168.0.100`.** Memory note `feedback_s8n_hosts_override.md` already covers this pattern. Onyx `/etc/hosts` works but doesn't scale to phones / friends' devices. | S | Stops LAN clients hairpinning through router on every fetch. Saves 1× NAT-loopback round-trip per TCP connection (~2 ms — small but free). | Low. |
| **7** | **Investigate the 68 × 404 in 6 h on `/web/*`.** Likely Cineplex theme @import or icon references with bad paths. Each 404 is a wasted RTT on cold-load. | S | Small but cumulative on cold-load. | Low — read-only investigation first. |
| **8** | **Strip `content-disposition: attachment` on Image responses.** Jellyfin emits this on every `/Images/Primary` GET. Browser ignores it for `<img>` references but it's hostile if anyone right-clicks "open image in new tab". | S | Cosmetic. | Low. |
### Recommended fix order
The order **#1 → #2 → #3** is the entire cold-load story. **#1
alone** turns "kinda slow" into "fine" for 90 % of the perceived
latency on first load. **#2** turns 2nd-page-nav into "instant" by
eliminating the 28-asset revalidation tax. **#3** is the WAN-optimist
nice-to-have; do once mobile clients matter.
Out of scope for this audit but worth noting from doc 13: GPU
transcode re-enable (#02 there) is the real win for *playback*
latency. Cold-load + playback are separate paths; both need
attention.
---
## 12. Out of scope (audited and found healthy)
- **TLS handshake latency** (22–25 ms LAN, normal for TLS 1.3 fresh
handshake; reuse hides it).
- **Cert chain depth** (2-cert chain, R13 intermediate).
- **MTU** (1500, no fragmentation).
- **HTTP/2** (working, multiplexed).
- **DNS lookup** (300 µs via /etc/hosts; 20–160 ms first time via
Pi-hole, cached after).
- **Hairpin NAT** (works, no extra latency).
- **`rate-limit@file` middleware** (token-bucket, ~zero overhead).
- **Sniff/CSP/STS/frame headers** — set correctly, no perf cost.
- **ServiceWorker** (notification-only, no perf-positive nor
perf-negative).
- **Traefik access log filter** (statusCodes 400-599 only — does NOT
log the 200 OK responses that dominate `/web/*`; the latency
histogram in §7 is therefore a 5xx/4xx-only sample, not full
traffic. The 5xx/4xx sample is conclusive enough for edge analysis
because all the slow ones are HLS transcode failures, not edge
problems).
---
## Appendix — raw evidence
### Curl timing (LAN-direct, 5 samples)
```
DNS=0.000024 CONN=0.001225 TLS=0.022960 TTFB=0.031569 TOTAL=0.040531
DNS=0.000024 CONN=0.001217 TLS=0.020182 TTFB=0.024190 TOTAL=0.030353
DNS=0.000028 CONN=0.001437 TLS=0.025502 TTFB=0.030467 TOTAL=0.035793
DNS=0.000023 CONN=0.001501 TLS=0.021998 TTFB=0.037444 TOTAL=0.041056
DNS=0.000023 CONN=0.001265 TLS=0.018536 TTFB=0.022942 TOTAL=0.027066
```
### Compression negotiation matrix
```
Accept-Encoding: br → content-length: 65485, no content-encoding
Accept-Encoding: gzip → content-length: 65485, no content-encoding
Accept-Encoding: (empty) → content-length: 65485, no content-encoding
Accept-Encoding: gzip,deflate,br,zstd --compressed → content-length: 65485, no content-encoding
```
### TLS chain
```
depth=2 C=US, O=Internet Security Research Group, CN=ISRG Root X1
depth=1 C=US, O=Let's Encrypt, CN=R13
depth=0 CN=arrflix.s8n.ru
Verification: OK
Protocol: TLSv1.3
Cipher: TLS_AES_128_GCM_SHA256
```
### ETag-conditional revalidation
```
First fetch: HTTP/2 200, etag "1db3a353daaafa4", content-length 499108
If-None-Match: HTTP/2 304, etag "1db3a353daaafa4", body empty
```
### Bundle inventory (28 bundles, total 2 806 173 bytes)
Top 15 by size — see §4 table. Full list reproducible from
`curl -s https://arrflix.s8n.ru/web/index.html | grep -oE 'src="[^"]*\.bundle\.js[^"]*"'`.
### Poster image fetch (5 samples — first cold, rest warm)
```
TTFB=0.385230s TOTAL=0.388290s SIZE=45660b ← cold (server transcode)
TTFB=0.025961s TOTAL=0.028951s SIZE=45660b
TTFB=0.037838s TOTAL=0.041724s SIZE=45660b
TTFB=0.034244s TOTAL=0.038364s SIZE=45660b
TTFB=0.036687s TOTAL=0.041616s SIZE=45660b
```
### Traefik static config (entrypoints)
```yaml
websecure:
address: ":443"
http:
middlewares:
- security-headers@file
- rate-limit@file
```
### Jellyfin router labels (compose)
```yaml
"traefik.http.routers.jellyfin.middlewares=security-headers@file"
"traefik.http.services.jellyfin.loadbalancer.server.port=8096"
```
### MTU + ping
```
PING 192.168.0.100 (192.168.0.100) 1472(1500) bytes of data
1480 bytes from 192.168.0.100: icmp_seq=1 ttl=64 time=1.66 ms
1480 bytes from 192.168.0.100: icmp_seq=2 ttl=64 time=1.75 ms
1480 bytes from 192.168.0.100: icmp_seq=3 ttl=64 time=1.17 ms
0 % packet loss, rtt min/avg/max/mdev = 1.171/1.524/1.745/0.252 ms
```
### Pi-hole DNS resolution
```
$ dig +short arrflix.s8n.ru @192.168.0.1
82.31.156.86 ← public WAN IP, not the LAN 192.168.0.100
```
### Traefik request-log latency histogram (jellyfin@docker, 6 h, 5xx/4xx only — 200s filtered out)
```
78 0ms
8 1ms
1 3ms
1 7ms
1 18ms
1 29ms
1 39ms
1 46ms
1 92ms
1 175ms
1 192ms
1 209ms
1 222ms
1 274ms
1 294ms
1 346ms
1 391ms
1 648ms
1 1168ms
1 1256ms
1 2140ms
1 4931ms
1 8118ms
1 9543ms
```
All entries >50 ms are `/videos/.../hls1/main/*.mp4` — HLS transcode
requests with 500/499 status, AV1+HEVC at 360–550 Mbit source. Edge
is not the bottleneck on those; CPU transcode is (doc 13 #02, #03).
---
## Sign-off
- Audit: 2026-05-08, read-only, ~30 min wall.
- No fixes applied. No state mutated. No container restart. No
Traefik reload. No header injected. Admin token used only for
read-side `/Items` and `/Items/.../Images` probes.
- Next audit due: **after fix #1 ships**, to confirm gzip/brotli
ratio on the actual deployed config and re-measure cold-load.

@ -1,430 +0,0 @@
# 24 — Storage / Disk-I/O / Filesystem Audit (Read-Only)
> Status: **read-only audit**, executed 2026-05-08 against
> `nullstone` (192.168.0.100). Scope: storage stack underneath Jellyfin
> on `arrflix.s8n.ru`. Sibling audits cover color/HDR, server runtime,
> and edge/network — this file owns LVM, disks, ext4, mount opts, image
> cache, transcode cache, and the RO bind-mount overhead.
>
> **No writes. No mount changes. No fstrim execution. No cache
> flushes. No SMART self-tests.**
---
## Executive summary
**Storage is not the bottleneck. CPU is.** Disk I/O across every
metric came back fast and healthy. The "loads kinda slow" symptom is
almost certainly playback-stall caused by a CPU-only host running 5
concurrent ffmpeg transcodes of the same file at load average 42 — not
disk. The storage layer is in the bottom third of the suspect list.
Top three storage-side observations (severity, then quick-win order):
1. **Single PV / single LV / single NVMe — no isolation between media
reads, transcode writes, OS, and Docker overlay churn.** Severity
**Y**. Every workload hits `/dev/nvme0n1` and the ext4 journal at
`keystone--vg-home`. Today the SSD shrugs it off (2.1 GB/s direct,
1.2 GB/s through the container RO mount), but transcode-write
contention with library-scan reads is real — and the box is
currently doing 5 concurrent ffmpegs. **Quick win: nothing today;
investment: split media onto a second LV (or second device) so
transcode-write churn does not share an ext4 journal with
library-scan reads.**
2. **Read-ahead is 128 KB on the LV (`dm-4`).** Severity **Y**.
Default for sequential 1080p streams from MKV; would benefit from
**512 KB–1 MB** for higher-bitrate or scanning workloads. Tiny
win, costs 30 seconds. **Quick win.**
3. **`relatime` on `/home` updates atime on the RO library (the bind
mount is RO from the container's view but the underlying ext4 is
RW from the host).** Severity **G→Y**. `relatime` is the kernel
default and only writes ~1 atime update per 24 h per file, so the
write cost on a 201-file library is rounding noise. Documented for
completeness; **not worth fixing**.
Ruled out as not-a-problem: rotating disk (it's NVMe), low free space
(62 % used, 146 GiB free — was 90 % at the prior audit, materially
better), inode pressure (6 % used), stale transcodes (zero >60 min
old), image-cache GC thrash (oldest cached image is 16 h old, no
churn), bind-mount overhead (40 % vs raw — but absolute throughput
still 12× a 4K HEVC stream needs), SSD wear (8 % used, 100 % spare,
zero media errors), and `data=ordered` journal write barriers
(NVMe-class device, irrelevant).
---
## 1. Disk + LVM topology
### Hardware
| Layer | Detail |
|---|---|
| Device | `/dev/nvme0n1`, **Intel SSDPEKKF512G8 NVMe**, 476.9 GiB, non-rotational, internal |
| Bus | NVMe |
| Loops (irrelevant) | `loop0..loop3` 256 M each (snap remnants — empty) |
Single physical drive. **No HDDs. No external storage. No NAS
mounts.** The "media on rotating media" hypothesis (a) is **ruled
out** — everything is on this NVMe.
SMART (NVMe Log 0x02):
| Field | Value |
|---|---|
| Critical Warning | `0x00` |
| Temperature | 43 °C |
| Available Spare | 100 % |
| Percentage Used | **8 %** |
| Power-On Hours | 18 597 |
| Power Cycles | 3 729 |
| Unsafe Shutdowns | 774 |
| Media + Data Integrity Errors | **0** |
| Error Log Entries | 0 |
| Data Units Read | 25.7 TB |
| Data Units Written | 25.9 TB |
Drive is healthy, mid-life. No remediation.
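The table above comes from the drive's SMART/Health log (page 0x02); either standard tool reads it without touching the disk:

```bash
# Sketch: read-only NVMe health probes (pick whichever is installed).
sudo nvme smart-log /dev/nvme0n1     # nvme-cli
sudo smartctl -a /dev/nvme0n1        # smartmontools
```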
### Partitions and LVM
```
nvme0n1 (476.9 GiB, NVMe SSD)
├─ nvme0n1p1 976 M vfat /boot/efi
├─ nvme0n1p2 977 M ext4 /boot
└─ nvme0n1p3 475 G LVM2 PV → keystone-vg
├─ keystone--vg-root 30.4 G ext4 /
├─ keystone--vg-var 11.4 G ext4 /var
├─ keystone--vg-swap_1 24.3 G swap [SWAP]
├─ keystone--vg-tmp 2.8 G ext4 /tmp
└─ keystone--vg-home 406.2 G ext4 /home ← media + jellyfin live here
```
Single-PV VG, **VFree = 0**. Cannot grow `home` without adding
another PV. Note swap is **on the same PV** as `home`; under memory
pressure (the prior audit caught 6.8 GiB swap in use) swap traffic
contends with media reads on the same NVMe queue.
### Mount table (relevant entries only)
| Source | Mountpoint | FS | Options |
|---|---|---|---|
| `keystone--vg-root` | `/` | ext4 | `rw,relatime,errors=remount-ro` |
| `keystone--vg-var` | `/var` | ext4 | `rw,nosuid,nodev,relatime` |
| `keystone--vg-tmp` | `/tmp` | ext4 | `rw,nosuid,nodev,noexec,relatime` |
| `keystone--vg-home` | `/home` | ext4 | `rw,nosuid,nodev,**relatime**` |
| `nvme0n1p2` | `/boot` | ext4 | `rw,relatime` |
| `nvme0n1p1` | `/boot/efi` | vfat | `rw,relatime,fmask=0077,dmask=0077` |
`relatime` is the kernel default; **strict `atime` is not in use** (good —
plain `atime` is the actual horror). `noatime` would shave ~1 atime
write per 24 h per file accessed; on a 201-file library that's
sub-noise. **Not a remediation candidate.** No `discard` flag (good
— online discard hurts performance; the weekly `fstrim.timer` is the
right pattern, see §8).
### Container bind mounts (Jellyfin)
| Host path | Container path | RW |
|---|---|---|
| `/home/docker/jellyfin/config` | `/config` | RW |
| `/home/docker/jellyfin/cache` | `/cache` | RW |
| `/home/user/media` | `/media` | **RO** |
| `/opt/docker/jellyfin/web-overrides/index.html` | `/jellyfin/jellyfin-web/index.html` | RO |
All bind mounts hit the same `keystone--vg-home` LV — config,
transcode cache, image cache, and media library all share one ext4
journal and one queue.
### ext4 features (`/dev/keystone--vg-home`)
```
Filesystem features: has_journal ext_attr resize_inode dir_index orphan_file
filetype extent 64bit flex_bg metadata_csum_seed
sparse_super large_file huge_file dir_nlink extra_isize
metadata_csum orphan_present
Default mount options: user_xattr acl
Total journal size: 1024 M (1 GiB — chunky but standard for 400 GiB)
Journal features: journal_incompat_revoke journal_64bit journal_checksum_v3
Filesystem state: clean
Last mount time: Sun May 3 23:42:28 2026
Mount count: 8
Block size: 4096
Inode count: 26 624 000
```
Journal mode is the ext4 default `data=ordered` (no override in
mountopts). On NVMe with `metadata_csum` and `journal_checksum_v3`,
this is **fine** — would only matter on slow rotational. Hypothesis
(b) "ext4 journal in `data=ordered` starves reads" is **ruled out**:
the device is NVMe-class and not the bottleneck.
---
## 2. Read throughput (1 large file, raw)
Test file: `Rick and Morty (2013) - S01E04 - M. Night Shaym-Aliens.mkv`
(1.5 GB, host path `/home/user/media/tv/...`).
| Test | Bytes | Wall | Throughput |
|---|---|---|---|
| `dd … bs=1M count=512 iflag=direct` (host, bypasses cache) | 537 MB | 0.258 s | **2.1 GB/s** |
| `dd … bs=1M count=512` (host, page-cache eligible) | 537 MB | 0.536 s | 1.0 GB/s (still warming) |
| `dd … bs=1M count=256 iflag=direct` (inside `jellyfin`, RO bind) | 268 MB | 0.233 s | **1.2 GB/s** |
**Bind-mount overhead = ~40 %** (2.1 → 1.2 GB/s). That's higher than
the "bind mounts are free" folklore but absolute throughput still
crushes any practical media bitrate (4K HDR HEVC tops out around
50 Mbit/s = 6.25 MB/s; 1.2 GB/s is **190× headroom**). **Not a
bottleneck. Not a remediation candidate.**
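The three probes, for re-runs after any storage change (episode path kept elided as above; `iflag=direct` bypasses the page cache so the number reflects the device, not RAM):

```bash
# Sketch: host direct, host cached, and container RO-bind read throughput.
f="/home/user/media/tv/.../Rick and Morty (2013) - S01E04 - M. Night Shaym-Aliens.mkv"
dd if="$f" of=/dev/null bs=1M count=512 iflag=direct   # host, direct I/O
dd if="$f" of=/dev/null bs=1M count=512                # host, page-cache eligible
docker exec jellyfin \
  dd if="/media/tv/.../Rick and Morty (2013) - S01E04 - M. Night Shaym-Aliens.mkv" \
     of=/dev/null bs=1M count=256 iflag=direct         # inside container, RO bind
```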
---
## 3. Random-read latency
`ioping` not installed on host or in container. Skipped.
Indirect signal: NVMe device-queue stats from `/proc/diskstats` for
`dm-4` (home LV):
```
reads: 15 003 996 read_sectors: 2 600 976 283 read_ms: 3 384 240
writes: 41 153 214 write_sectors: 1 997 023 232 write_ms: 145 844 732
in-flight: 0 io_ms: 5 153 616
```
Average per-read service ≈ **0.226 ms**, average per-write ≈ **3.5 ms**
(consistent with NVMe + ext4 journal flush). No queue stalls
observed.
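The per-op figures derive directly from those counters (a sketch; `/proc/diskstats` field layout per the kernel's iostats documentation: field 4 = reads completed, 7 = ms reading, 8 = writes completed, 11 = ms writing):

```bash
# Sketch: average service time per read/write op on the home LV.
awk '$3 == "dm-4" {
  printf "avg read  = %.3f ms\n", $7  / $4
  printf "avg write = %.3f ms\n", $11 / $8
}' /proc/diskstats
```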
---
## 4. Cache size breakdown
| Path | Bytes | Notes |
|---|---|---|
| `/cache` (total) | **84 MB** | Entire Jellyfin cache fits in one MP3 album |
| `/cache/transcodes` | 39–61 MB | Live during audit; **5 concurrent ffmpegs** (see §6) |
| `/cache/images` | 39 MB | 412 files in 16 hash-prefixed dirs |
| `/cache/images/resized-images` | 39 MB | 0 dir, 1 dir, …, f dir (16 buckets, 18–30 files each) |
| `/cache/omdb` | 84 KB | Plugin response cache |
| `/cache/fontconfig` | 36 KB | |
| `/cache/attachments` | 12 KB | Subtitle/font extracts |
| `/cache/imagesbyname` | 4 KB | Empty |
Total cache = 84 MB on a 400 GB filesystem. **There is no cache
pressure.** The "cache being garbage-collected mid-page-load"
hypothesis (c) is **ruled out** (oldest cached image timestamp =
2026-05-08 01:12 BST, newest = 17:42 BST = **16.5 h retention with
no eviction**).
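The breakdown and the retention check are both reproducible read-only from the host side (paths per the bind-mount table in §1):

```bash
# Sketch: cache sizes plus oldest/newest cached image (eviction check).
sudo du -sh /home/docker/jellyfin/cache/ /home/docker/jellyfin/cache/*/
sudo find /home/docker/jellyfin/cache/images -type f -printf '%T@ %T+ %p\n' |
  sort -n | sed -n '1p;$p'   # first line = oldest, last = newest
```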
---
## 5. Image cache miss-vs-hit timing
Public asset latency from onyx → `https://arrflix.s8n.ru`:
| URL | Attempt 1 (cold) | Attempt 2 (warm) |
|---|---|---|
| `/web/assets/img/icon-transparent.png` | 0.227 s | 0.047 s |
| `/web/serviceworker.js` | 0.059 s | 0.059 s |
| `/web/main.jellyfin.bundle.js` | 0.092 s | 0.052 s |
5-sample steady state on `/web/main.jellyfin.bundle.js` = **44–68 ms,
median 49 ms**. Traefik + Jellyfin static-asset path is fast.
Direct poster URLs (`/Items/{id}/Images/Primary`) require an auth
token; could not be probed without a fresh `X-Emby-Token`. Inferred
from on-disk evidence: the `resized-images` cache contains 412
WebPs, all under 200 KB, no eviction in the last 16 h. **Image cache
serves all current items from disk on warm path.**
Hypothesis (c) is **ruled out**.
---
## 6. Stale-transcode detection
```
/cache/transcodes:
total bytes: 39 MB (was 61 MB earlier in audit, churn = active stream)
total files: 26
files >60 min old: 0
bytes >60 min old: 0 MB
```
`Clean Transcode Directory` task last ran `2026-05-08T02:13` (per
audit 13 task list). **Currently zero stale transcode segments.**
Hypothesis (d) is **ruled out** — no accumulation.
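The stale-segment check, for re-runs (a sketch; counts files older than 60 minutes in the live transcode dir):

```bash
# Sketch: stale-transcode detector; non-zero output means cleanup is lagging.
docker exec jellyfin find /cache/transcodes -type f -mmin +60 -printf '%s %p\n' |
  awk '{n++; b+=$1} END {printf "files>60min: %d, bytes: %d\n", n, b}'
```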
However, **5 concurrent ffmpeg processes are transcoding the same
file** right now:
```
PID CPU file
1685478 246% Rick and Morty S01E01 - Pilot.mkv
1686665 203% Rick and Morty S01E01 - Pilot.mkv (same file)
1686651 198% Rick and Morty S01E01 - Pilot.mkv (same file)
1689000 125% Rick and Morty S01E01 - Pilot.mkv (same file)
1689109 120% Rick and Morty S01E01 - Pilot.mkv (same file)
```
This is a **CPU-side** issue (no ffmpeg de-dup, no segment
throttling — see audit 13 finding 03). It causes:
- Load average **42.62 / 22.84 / 12.32** (12-core box).
- Swap usage 7.8 GiB / 24 GiB.
- I/O wait however is **0 %** in `vmstat` (`wa=0`).
The host CPU is saturated, not the disk. **Storage layer is not
this user's bottleneck.**
---
## 7. Inode + free-space stats
| Filesystem | 1K-blocks | Used | Available | Use % | Inodes | IUsed | IUse % |
|---|---|---|---|---|---|---|---|
| `keystone--vg-home` (`/home`) | 418 106 320 | 244 025 392 | 152 768 828 | **62 %** | 26 624 000 | 1 489 612 | **6 %** |
| `keystone--vg-root` (`/`) | — | — | — | — | — | — | — |
| `keystone--vg-var` (`/var`) | 12 G | 2.0 G | 8.6 G | **19 %** | n/a | n/a | n/a |
**Free space went from 40 GiB at audit 13 (90 % full) to 146 GiB now
(62 %).** Material improvement; the prior "low free space"
hypothesis (e) is **ruled out**. Inode pressure ruled out.
(Note: `/home` houses `/home/user/docker-data/100000.100000/...`
which contains all userns-remapped Docker overlay2 trees. The 233 G
used number includes container layers, not just media. Library
itself is 201 files.)
---
## 8. fstrim status
```
fstrim.timer Loaded, enabled, active (waiting)
Last triggered: Sun 2026-05-03 23:42:29 BST
Next trigger: Mon 2026-05-11 01:12:58 BST
fstrim --dry-run /home → /home: 0 B (dry run) trimmed
```
Weekly trim is configured and recently ran (one week before next
trigger). **Dry-run reports 0 B candidate** → there is no untrimmed
free space on `/home`. SSD performance degradation from
unTRIMmed-blocks is **not** a factor. No `discard` mount option
(correct — async batched trim via timer is preferred over inline).
---
## 9. Read-ahead and queue settings
| Block device | `read_ahead_kb` | scheduler | `nr_requests` |
|---|---|---|---|
| `nvme0n1` (physical) | **128 KB** | `[none] mq-deadline` | 1023 |
| `dm-4` (`keystone--vg-home`, the LV) | **128 KB** | n/a | n/a |
(`/sys/block/dm-4` exposes no scheduler or `nr_requests` knobs; dm devices inherit them from the backing device.)
128 KB read-ahead is the kernel default. For sequential MKV streams
this is OK; for library-scan workloads (`stat` + open + read first
chunk per file) it's also OK. Bumping to 512 KB or 1024 KB would
help **scan throughput** during a Jellyfin library refresh — minor
win, ~30 s of work.
NVMe is using `none` scheduler (correct for NVMe — multiqueue + no
elevator).
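A sketch of both the live bump and the persistence half (rule filename as suggested in §12; 1024 shown, use 512 for the conservative option):

```bash
# Sketch: raise read-ahead now, and survive reboots via udev.
echo 1024 | sudo tee /sys/block/nvme0n1/queue/read_ahead_kb
sudo tee /etc/udev/rules.d/60-readahead.rules <<'EOF'
# Larger read-ahead for the media NVMe (kernel default is 128 KB)
ACTION=="add|change", KERNEL=="nvme0n1", ATTR{queue/read_ahead_kb}="1024"
EOF
sudo udevadm control --reload
```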
---
## 10. RO bind-mount overhead — confirmed
(From §2.) Host direct = 2.1 GB/s. Container RO bind = 1.2 GB/s.
Overhead ≈ 40 %, which is higher than expected; likely a side-effect
of:
- userns remap (`100000.100000` shifts uids)
- the `nosuid,nodev` flags on `/home` propagating into the bind
- container's `read_ahead_kb` is **not** configurable through bind
(inherits 128 KB)
**Not actionable today.** Both numbers are 100×+ of any media
bitrate. Documented to rule out hypothesis (f).
`atime` cost on RO bind: bind mount inherits the host's `relatime`
semantics — at most one atime write per file per 24 h, gated by
`relatime`. On 201 files that's ≤ 201 atime writes/day = **rounding
noise**. Hypothesis (f) **ruled out**.
---
## 11. Concrete remediation list — ranked
Severity legend: **R** = red (acute, fix this week), **Y** = yellow
(deferred, document risk), **G** = green (audited, healthy, no
action). Effort: **S** ≤ 30 min, **M** half-day, **L** > 1 day.
| # | Severity | Effort | Bucket | Action | Why |
|---|:-:|:-:|---|---|---|
| S01 | Y | S | Quick-win | Bump `read_ahead_kb` on `/dev/nvme0n1` to **512 KB** (sysfs or udev rule) | Helps library-scan and large-MKV streams. Tiny risk; reverts on reboot if set live. |
| S02 | Y | M | Quick-win | Add `noatime` (replacing `relatime`) to `/home` mount in `/etc/fstab` | Eliminates the residual `relatime` writes; cosmetic but cheap. Requires a remount; do during a window with no playback. |
| S03 | Y | M | Investment | Carve a separate **`media` LV** (or attach a second NVMe) for `/home/user/media` and bind-mount it RO into Jellyfin | Isolates library reads from transcode-write churn and Docker overlay churn on the same ext4 journal. Today it is fine; at scale it will not be. |
| S04 | Y | M | Investment | Move `keystone--vg-swap_1` off `keystone-vg` (or onto a separate device) | Swap is currently 7.8 GiB used and shares the NVMe queue with media reads. CPU saturation is the proximate cause but cleanly isolating swap helps when CPU finally gets fixed (GPU re-enable, see audit 13 #02). |
| S05 | Y | M | Investment | Add a second PV to `keystone-vg` so the VG has free space | `vgs` shows **VFree=0**. Any future `lvextend` will fail until a PV is added. Latent ops trap. |
| S06 | G | — | — | Keep weekly `fstrim.timer` as-is | Healthy, current. |
| S07 | G | — | — | Keep image cache untouched | 84 MB total cache, 16 h retention, no GC pressure. |
| S08 | G | — | — | No change to `data=ordered` ext4 journal | NVMe; mode is fine. |
**The single biggest "loads kinda slow" win lives in audit 13
(finding 03 — enable transcode throttling + segment deletion).
Storage is not where this is fixed.**
---
## 12. Quick-win vs investment
### Quick-win (≤30 min total, today)
- **S01** — `echo 1024 > /sys/block/nvme0n1/queue/read_ahead_kb` (or
  512). Reverts on reboot; persist via the udev rule sketched in §9
  (`/etc/udev/rules.d/60-readahead.rules`). Marginal but free.
- **S02** — flip `relatime` → `noatime` in `/etc/fstab` for
  `/home`. Cosmetic but cheap. **Skip if the box is even at half
  load** — a bad fstab + reboot is an outage; only do during a
  planned window.
### Investment (half-day to multi-day, plan)
- **S03** — separate `media` LV. Requires `lvcreate`, `mkfs`, rsync
the library, swap the bind-mount in compose. ~half-day. Pays back
when (a) library grows past the current 201 files, (b) GPU
transcode is re-enabled (audit 13 #02) and many concurrent reads
start happening.
- **S04** — relocate swap. Only meaningful after GPU re-enable
closes the CPU-saturation root cause.
- **S05** — second PV. Trivial mechanically (`pvcreate`, `vgextend`),
blocked on having a second device. Defer until needed.
### No-op (audited and healthy)
- SMART status (8 % wear, no errors)
- ext4 features and journal mode
- Inode usage (6 %)
- Free space (62 %, 146 GiB headroom)
- Cache size (84 MB total)
- Stale transcodes (zero)
- `fstrim.timer` (working, candidate-bytes = 0)
- Bind-mount throughput (1.2 GB/s, 190× any 4K stream)
---
## 13. Sign-off
- Audit: 2026-05-08, read-only, ~15 min wall.
- No fixes applied. No state mutated. No container restart. No SMART
self-test. No fstrim execution. No mount changes.
- **Top storage culprit: none.** Storage stack is healthy. The
"loads kinda slow" symptom is CPU-side (5 concurrent ffmpegs at
load 42, audit 13 #02 + #03).
- **Top quick-win: S01 — bump `read_ahead_kb` to 512 KB on
`nvme0n1`** for marginal scan/stream gain. Real fix lives in
audit 13.
- Next audit due: **2026-08-08** (quarterly, with audit 13).

@ -1,373 +0,0 @@
# 25 - English Leak Deep-Dive (Post-Lockdown "Abspielen" Persistence)
> Investigation triggered after the 2026-05-08 multi-agent English-only
> lockdown sweep landed (server-wide UICulture, per-user UICulture for 9/9,
> DisplayPreferences CustomPrefs.language for 32 entries, web shim with
> `navigator.language` + localStorage + Accept-Language strip + CSS hide of
> language switchers). Operator hard-killed Trivalent (cache + LS + SW
> wiped) and restarted, yet the Play button STILL renders **"Abspielen"**.
> Audio + subtitle preferences correctly render English (proof the per-user
> preference layer IS landing for non-UI surfaces).
Date: 2026-05-08
Investigator: deep-dive sibling agent
Mode: read-only on Jellyfin live state, read-only on container, no
restarts, no shim modifications. Headless-Chromium reproductions used to
prove behaviour rather than theorise.
Prior reading (do not repeat findings from):
`docs/15-force-english.md`, `docs/19-english-only-audit.md`,
`docs/20-english-only-lockdown.md`, `docs/22-jellyfin-runtime-perf-audit.md`.
---
## 1. Executive Summary — actual root cause, with proof
The multi-layer lockdown's **per-user `Configuration.UICulture` pin is
inert with respect to the web SPA's UI-string locale**. The web SPA's
`jellyfin-web` bundle does not read `Configuration.UICulture` from the
authenticated user object at all — that field is referenced in exactly two
chunks (`wizard-start.<hash>.chunk.js` and `25583.<hash>.chunk.js`), both
of which are admin **dashboard** forms for the SERVER-WIDE UICulture (the
"Display Language" admin setting), and neither is loaded on a normal user
session. Verified live:
```
$ docker exec jellyfin grep -lE "UICulture" /jellyfin/jellyfin-web/*.js
/jellyfin/jellyfin-web/index.html # ARRFLIX shim text only
$ docker exec jellyfin grep -lE "UICulture" /jellyfin/jellyfin-web/*.chunk.js
/jellyfin/jellyfin-web/25583.95a80bf8834e61a9a8e4.chunk.js
/jellyfin/jellyfin-web/wizard-start.a4dfcf169516d40c4e52.chunk.js
$ docker exec jellyfin grep -oE ".{40}UICulture.{60}" \
/jellyfin/jellyfin-web/wizard-start.a4dfcf169516d40c4e52.chunk.js
up/Configuration")).then((function(n){n.UICulture=$("#selectLocalizationLanguage",t).val(),
e.ajax({type:"POST...
```
Both occurrences are POSTs to `/System/Configuration` (the server-wide
dashboard form), not reads from `/Users/{id}.Configuration`.
**The SPA's actual locale resolver** (decompiled from
`main.jellyfin.bundle.js`) is:
```js
function g(){
return document.documentElement.getAttribute("data-culture")
|| (navigator.language ? navigator.language
: navigator.userLanguage ? navigator.userLanguage
: (navigator.languages?.length) ? navigator.languages[0]
: "en-us");
}
function w(){
var e;
try { e = i.currentSettings.language() } // localStorage.getItem("language")
catch(e){ }
b(e = e || g());
l = S(e); // S = lowercase + replace _ with -
document.documentElement.setAttribute("lang", l);
...
}
```
`i.currentSettings.language()` reads `localStorage.getItem("language")`
(no user prefix — verified via `key:"language"` lookup with `t=false`
prefix flag in the Settings.get implementation). Per-user
`Configuration.UICulture` is never copied into this localStorage key by
any code path in the bundle.
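Both halves of that claim are re-checkable with read-only greps against the shipped bundle (a sketch; the second grep exits non-zero on zero matches, which is the expected outcome here):

```bash
# Sketch: the settings store reads a global, un-prefixed "language" key...
docker exec jellyfin grep -oE '.{20}key:"language".{40}' \
  /jellyfin/jellyfin-web/main.jellyfin.bundle.js
# ...and the main bundle never references UICulture at all (expect count 0).
docker exec jellyfin grep -c UICulture \
  /jellyfin/jellyfin-web/main.jellyfin.bundle.js
```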
**The ARRFLIX shim is the ONLY layer that actually pins the SPA UI
language**, by overriding `Navigator.prototype.language`,
`Navigator.prototype.languages`, and pre-seeding `localStorage.language`
to `en-US`. Headless Trivalent reproductions with explicit
`--lang=de-DE --accept-lang=de-DE,de,en` confirm the shim works correctly:
```
$ trivalent --headless=new --lang=de-DE --accept-lang=de-DE,de,en \
--user-data-dir=/tmp/clean-profile --enable-logging=stderr --v=1 \
https://arrflix.s8n.ru/web/index.html
$ grep -E "json.*chunk.js" /tmp/headless.log
...NotifyBeforeURLRequest: https://arrflix.s8n.ru/web/en-us-json.667484b4a441712c7e05.chunk.js
$ grep "<html" /tmp/headless.dump
<html class="preload layout-desktop" dir="ltr" lang="en-us">
```
So the shim produces the correct chunk request and the correct `<html
lang>` attribute when running against a freshly-isolated browser profile
that never hit arrflix.s8n.ru before. The de-json chunk is **never**
fetched in this scenario.
**Therefore the operator's persistent "Abspielen" is not a leak in any
server-side or shim-side layer — it is stale browser-side state that
predates the shim deploy (web-overrides/index.html mtime
2026-05-08 17:22:00) and survived the operator's wipe.** The candidate
stale-state vectors, in order of likelihood:
1. **Stale `index.html` in HTTP disk cache.** The server emits **no
`Cache-Control` header** on `/web/index.html`; `last-modified` is
`Fri, 08 May 2026 16:22:00 GMT`. Per RFC 7234 §4.2.2, Chromium
heuristically caches for `0.1 * (now − Last-Modified)` ≈ 24 minutes
when an asset has no Cache-Control. If the operator hit `/web/`
between the shim deploy at 17:22 and the wipe attempt, then for the
~24-minute heuristic window the OLD pre-shim index.html could
reload from disk on subsequent visits without a network round-trip.
Mullvad-style "clear site data" wipes (cookies, LS, IndexedDB, SW)
do NOT always include the HTTP cache for the eTLD+1 — Chromium's
`chrome://settings/clearBrowserData` exposes "Cached images and
files" as a separate checkbox from "Cookies and other site data",
and a partial-wipe would leave a stale shim-less index.html in
place. The operator's reported wipe path matters here — if it was
"Clear site data" via DevTools (LS/SW only) or a Mullvad-style
per-origin wipe scoped to storage but not cache, the index.html
survives.
2. **A second browser profile / second browser the operator forgot.**
The operator has multiple Chromium-family browsers installed
(`~/.var/app/{com.google.Chrome, com.google.ChromeDev,
org.chromium.Chromium, io.github.ungoogled_software.ungoogled_chromium,
net.mullvad.MullvadBrowser}`) plus Trivalent at
`~/.config/trivalent`. Trivalent's pref `selected_languages` is
`en-GB,en-US,en` (not German), so the German rendering must come
from a browser the operator hasn't wiped. A profile that hit
arrflix.s8n.ru weeks ago, when no shim was in place, with a
`de-*` Accept-Language at the time, would have its in-place
localStorage `language=de` (set by the SPA's settings persistence
on first load) AND its on-disk index.html cache predating the shim.
3. **The operator's screenshot was captured BEFORE the wipe.** "I
wiped and still see Abspielen" can be the operator restating an
older screenshot rather than reproducing a fresh one. Verifiable
only by asking the operator to take a new screenshot post-wipe.
**Severity: LOW.** The shim is functioning correctly; this is a
deploy-day-only stale-cache window. The clean fix is two lines of
docker-compose / Traefik headers config to set proper `Cache-Control`
on `/web/index.html` so future shim deploys propagate without manual
operator wiping.
---
## 2. Per-Hypothesis Verdicts
| # | Hypothesis | Verdict | Probe + evidence |
|---|---|---|---|
| 1 | `<link rel="prefetch">` for the de chunk fires before the inline shim runs | **Ruled out** | `grep -ciE 'rel="?(prefetch\|preload\|modulepreload)' index_de.html` → `0`. The bundle uses `<script defer>`, which executes after parsing, AFTER the inline shim has already run. The shim is the first executable JS in the document. |
| 2 | Service Worker pre-cached the de chunk | **Ruled out** | `serviceworker.js` is 768 bytes and contains only a `notificationclick` handler + `clients.claim()`. No `fetch` event listener, no precache, no cache.put. Cannot intercept locale chunk loads. Source: `curl https://arrflix.s8n.ru/web/serviceworker.js`. |
| 3 | Bundle reads `<html lang>` attr on first paint | **Ruled out** | The bundle's locale resolver `g()` reads `document.documentElement.getAttribute("data-culture")` — NOT `lang`. The `lang` attribute is *set* by the bundle (via `document.documentElement.setAttribute("lang", l)` in `w()`), it is not read. The served HTML opens with `<html class="preload" dir="ltr">` — no `data-culture`, no `lang`. |
| 4 | Bundle reads `document.cookie` for locale | **Ruled out** | `grep -ciE 'document.cookie.{0,40}lang\|locale\|culture' main.bundle.js` → `0`. No cookie-based locale path in any bundle. |
| 5 | Hard-coded `de-DE` fallback in the bundle | **Ruled out** | The hard-coded fallback is `var f="en-us"` (decompiled from `main.bundle.js`) — used when `navigator.language`, `navigator.languages`, `navigator.userLanguage`, and `data-culture` are all absent. Falls back to English, not German. |
| 6 | Server sends `Content-Language: de` | **Ruled out** | `curl -sI https://arrflix.s8n.ru/web/index.html` returns no `Content-Language` header. |
| 7 | Traefik/upstream content-negotiates locale (`Vary: Accept-Language`) | **Ruled out** | `curl -sI` returns no `Vary` header. Both `Accept-Language: de-DE,de;q=0.9,en;q=0.5` and `Accept-Language: en-US` return byte-identical 65485-byte HTML (same etag `1dcdf06cc053bcd`). Confirmed via `diff -q` of two captures. |
| 8 | Per-user `DisplayPreferences.CustomPrefs.language` writes the wrong key | **Inconclusive (irrelevant)** | DisplayPreferences is read by the bundle for `chromecastVersion`, `dashboardTheme`, home-section ordering, etc. — not for UI locale. The locale-related code paths (`g()`, `w()`, `i.currentSettings.language()`) read from `Navigator.prototype.language`, `data-culture`, and `localStorage.getItem("language")` only. Per-user DisplayPreferences.CustomPrefs.language could be set to anything and the SPA UI would not change. The 32 entries written by sibling A2 are a no-op for the Abspielen bug. |
| 9 | Cineplex theme injects German strings via CSS `content:` | **Ruled out** | `grep -ciE 'content:.{0,80}(Abspielen\|Fortsetzen\|Anzeigen)' /opt/jellyfin/config/branding/*.css 2>&1` returns 0. Themes are CSS-only and Jellyfin's branding `CustomCss` is plain CSS, not capable of localising button labels. |
| 10 | Plugin contributes the Play string | **Ruled out** | `GET /Plugins` lists 6 plugins (AudioDB, MusicBrainz, OMDb, Open Subtitles, Studio Images, TMDb). All are metadata-source plugins with server-side string surfaces only; none ship web-bundle UI strings. Verified by inspecting each plugin's `Plugin.{xml,json}` for `web/` or `client/` resources — none. |
| 11 | Pre-auth chunk request races the shim | **Ruled out by reproduction** | Headless Trivalent run with `--lang=de-DE --accept-lang=de-DE,de,en` against a freshly-created `--user-data-dir`, capturing the full network log: the ONLY locale chunk requested is `en-us-json.667484b4a441712c7e05.chunk.js`. The de chunk URL is never touched. The shim's `Object.defineProperty(Navigator.prototype, 'language', …)` runs synchronously during HTML parsing (inline non-defer script), before any deferred bundle script executes (HTML5 spec §4.12.1 — defer scripts execute after parsing in document order; inline scripts execute when the parser reaches them). The locale resolver `g()` runs inside the deferred bundle, so the override is in effect by the time `g()` is called. |
| 12 | Browser sends `Accept-Language: de-DE` and the SPA reads it via fetch echo | **Ruled out** | The SPA's locale resolver does NOT make any pre-bundle network request to read its own Accept-Language. The resolver is purely synchronous and only reads `navigator.language`, `navigator.userLanguage`, `navigator.languages[0]`, plus the `data-culture` DOM attr and `localStorage.language`. Confirmed by full-text `grep -ciE 'fetch.{0,200}accept.language\|XMLHttpRequest.{0,200}accept.language' main.bundle.js` → matches only the ARRFLIX shim's STRIP code, no read. |
---
## 3. Concrete remediation, ranked by blast radius
### R1 — Add `Cache-Control: no-cache` on `/web/index.html` (Traefik header) — RECOMMENDED FIRST
Smallest blast radius. Forces every browser to revalidate the index.html
on every visit, so future shim updates propagate within one tab refresh
instead of a 10–55 minute heuristic-cache window.
```yaml
# In /opt/docker/jellyfin/docker-compose.yml under the jellyfin service labels:
- "traefik.http.routers.jellyfin.middlewares=jellyfin-nocache-html@docker"
- "traefik.http.middlewares.jellyfin-nocache-html.headers.customresponseheaders.Cache-Control=no-cache, must-revalidate"
```
**Caveat:** this header would be applied to ALL responses on the
`jellyfin` router, including the immutable hashed chunk files. Chunks
SHOULD remain cacheable forever (they're hash-fingerprinted). Therefore
either:
- **Path A (simpler):** apply `no-cache` only to `/web/index.html` via
a path-scoped middleware, leaving everything else alone:
```yaml
- "traefik.http.middlewares.jellyfin-nocache-html.headers.customresponseheaders.Cache-Control=no-cache, must-revalidate"
- "traefik.http.routers.jellyfin-html.rule=Host(`arrflix.s8n.ru`) && Path(`/web/index.html`)"
- "traefik.http.routers.jellyfin-html.middlewares=jellyfin-nocache-html@docker"
- "traefik.http.routers.jellyfin-html.priority=100" # higher than the catch-all
- "traefik.http.routers.jellyfin-html.service=jellyfin@docker"
```
- **Path B (cleaner, better long-term):** apply `Cache-Control:
public, max-age=31536000, immutable` to all `/web/*.{js,css,chunk.js}`
(which Jellyfin upstream already fingerprints) AND `Cache-Control:
no-cache, must-revalidate` to `/web/index.html` and `/web/manifest.json`.
This is the conventional SPA cache strategy; we get the best of both
worlds (instant chunk load + always-fresh shim).
**Do NOT install yet** — operator decision required on Path A vs B.
### R2 — Operator-side: document the precise wipe procedure for shim updates
Add to `docs/20-english-only-lockdown.md` "Re-apply procedure" section:
> When updating the web shim (`web-overrides/index.html` or any file
> bind-mounted into `/jellyfin/jellyfin-web/`), every active browser
> session must be wiped with **"Clear browsing data" → tick BOTH
> "Cookies and other site data" AND "Cached images and files"** for
> the `arrflix.s8n.ru` origin. DevTools "Storage → Clear site data"
> alone does NOT clear HTTP disk cache in all Chromium variants; the
> all-time wipe via `chrome://settings/clearBrowserData` is required.
This closes the operator-process gap that left a stale index.html in
the operator's browser.
### R3 — Stop investing in per-user `Configuration.UICulture` POSTs
Per the proof in §1, this field has no effect on the web SPA's UI
language. It controls only:
- The user object the API returns (so the dashboard form for "edit
user" displays the correct value if anyone ever opens it).
- Server-side string surfaces that DO honour per-user culture
(Live TV EPG metadata for the API caller, some plugin responses),
but NOT the web client UI strings.
Keep `bin/force-english-all-users.sh` and the
`bin/add-jellyfin-user.sh` UICulture line for cosmetic consistency
and future-proofing (Jellyfin upstream might wire it up someday), but
**stop expecting it to fix UI-string leaks**. The shim is the only
thing pinning the UI.
### R4 — Defense-in-depth: bind-mount empty `de.json` chunk stubs
Doc 19 §"Files to Delete" (Path B) proposed this. Still valid as a
belt for Path R1, but high-maintenance (chunk hashes rotate on every
Jellyfin upgrade — currently `de-json.1afccc006ab8bb6c5953.chunk.js`
but a `jellyfin/jellyfin` image bump could change it). **Defer
indefinitely** unless the operator wants the German strings physically
unreachable for paranoia.
### R5 — `Accept-Language` rewrite at Traefik (doc 19 §"Path A — Traefik middleware")
```yaml
- "traefik.http.middlewares.arrflix-lang.headers.customrequestheaders.Accept-Language=en-US,en;q=0.9"
- "traefik.http.routers.jellyfin.middlewares=arrflix-lang"
```
**Useful but redundant** with the existing shim. The shim already
strips Accept-Language on outbound fetch/XHR (verified live in shim
source). The browser-issued INITIAL request to `/web/index.html` is
the only one that would ever carry Accept-Language, and the index.html
is byte-identical regardless of header (proven in hypothesis 7). So
this rewrite would prevent **future** Jellyfin upstream behaviour
changes that start using Accept-Language on the index.html response,
but doesn't fix anything currently broken.
---
## 4. Why prior audits (15, 19, 20) missed this
Doc 15 correctly diagnosed that the SPA "falls back to
`Accept-Language` when `UICulture` is unset" — a CONJECTURE based on
observing that German appeared and that `Configuration.UICulture` was
absent on every user. The conjecture was never tested by GREPPING THE
WEB BUNDLE for `UICulture`, which would have shown immediately that
the SPA never reads it. Doc 19 inherited the conjecture verbatim. Doc
20 codified it into the lockdown procedure. Three audits in a row,
all assuming a causal link that doesn't exist.
The actual causal layer (`Navigator.prototype.language` →
`<lang>-json` chunk selection) was correctly identified in doc 19's
§"Layer 18" remediation suggestion — which is what shipped as the
shim. The shim is the fix; the per-user UICulture pin is theatre.
After this deep dive, doc 20's "Layer 2" (per-user) and "Layer 1"
(server-wide) sections should be re-labelled as "metadata-affecting"
rather than "UI-affecting", and doc 19's primary-fix table-row should
be flipped from "Layer 5 (per-user UICulture) is the biggest impact"
to "Layer 18 (navigator.language shim) is the only impact on UI
strings".
A subtler lesson: when doc 19 said "all 8 users have `UICulture` absent
→ that's why German leaks", the audit *also* noted (table row 16)
that 92 non-English locale chunks are reachable and contain
`"Play":"Abspielen"` etc. That observation alone, combined with the
chunk-loading code, would have shown the actual mechanism. The fix
was correctly proposed (shim with `navigator.language` override),
but the diagnosis text emphasised the wrong layer.
---
## 5. New hypotheses uncovered during probe
### H13 — `index.html` is served with NO `Cache-Control` header
Already covered above (R1). Not a leak per se but the mechanism by which
ANY future shim deploy can fail to propagate without operator wiping.
Critical to fix before the next shim iteration to avoid this whole
"why is it still German" dance recurring.
### H14 — Operator may have multiple browser profiles/binaries with stale state
Operator has Trivalent (`~/.config/trivalent`), Chromium
(`~/.config/chromium` with profile `Default` and `Profile 1`),
`~/.config/google-chrome-for-testing`, plus Flatpak: Chrome,
ChromeDev, Chromium, ungoogled-Chromium, MullvadBrowser. Eight
Chromium-family installs. A wipe of "the browser" plausibly missed at
least one. **Probe to ask operator:** which exact browser binary +
which exact profile produced the screenshot? "Trivalent default
profile with all storage wiped including HTTP cache" yields a
different conclusion from "Mullvad ad-hoc surgical wipe targeting
storage only".
### H15 — Per-user `Configuration.UICulture` lockdown layer is doing literal nothing
Documented in R3. Worth flagging because the op cost (running
`bin/english-lockdown-runner.sh` weekly via systemd timer) is nonzero
and we now know the only layer that matters is the shim, which
doesn't need re-running because it's a static bind-mount.
### H16 — Chunk URLs in the require.context are immutable per Jellyfin upgrade
Verified: `n(73125)` maps `./en-us.json` → `[20233, 79754]` and
`./de.json` → `[99810, 89409]`. These ID pairs are baked into the
runtime.bundle.js at Jellyfin build time. So a Jellyfin image upgrade
WILL change the chunk hashes (filename, e.g.
`de-json.<hash>.chunk.js`) but NOT the chunk-id mapping in
runtime.bundle.js. Any defense-in-depth via 1-byte stub bind-mounts
(R4) MUST be regenerated after every image bump — not just the
filename, but if the chunk-id stub-content `(self.webpackChunk = …)
.push([[CHUNK_ID], {}])` also depends on the chunk-id, then those
lines need re-emission too. Treat as part of the upgrade runbook,
not a one-shot install.
### H17 — `localStorage.language` is shared across all Jellyfin users on the same browser
The settings store reads `localStorage.getItem("language")` with NO
user-prefix when the prefix flag is `false` (verified in `key:
"language"` getter signature with `t = false`). All other
preferences are stored as `<userId>-<key>`, but `language` is
specifically global. So if user A on a browser with German pref
loads the SPA pre-shim and the SPA writes `localStorage.language =
"de"` into the user-settings store, then user B on the same browser
inherits the German preference until either the shim runs (which
overwrites it on every load) or the storage is wiped. The shim's
`pinLocale()` belt re-pins on every visibility change, so this isn't
exploitable, but it IS the ONE persistence mechanism that survives
both server-side UICulture pinning and per-user-DisplayPreferences
writes.
---
## Sign-off
- **Mode:** read-only on Jellyfin live state (no POST/PATCH/PUT).
Read-only on container (zero `docker exec` writes). Shim file
unchanged. Headless Trivalent test runs used `/tmp/eng-deep-dive/cprofile{,2}`
isolated profiles, no production browser state touched.
- **Live evidence captures:**
- `/tmp/eng-deep-dive/index_de.html` (65485 bytes, has shim block)
- `/tmp/eng-deep-dive/runtime.bundle.js` + `main.bundle.js` +
`37869.bundle.js` (decompiled to confirm locale-resolver code path)
- `/tmp/eng-deep-dive/headless2.log` (only en-us-json chunk
requested under explicit German Accept-Language)
- **Recommendation order:** R1 (Cache-Control no-cache on index.html)
→ R2 (document wipe procedure) → R3 (stop investing in per-user
UICulture for UI). R4 and R5 are optional defense-in-depth, not
required to fix the screenshot.
- **Next-action owner:** operator decides Path A vs B for R1; web
agent then applies the chosen Traefik label diff in a single commit.
- **Severity:** LOW — shim is functioning, this is a stale-cache
process gap, not a continuous leak.

File diff suppressed because it is too large.
@ -1,132 +0,0 @@
# 27 — ARRFLIX status snapshot (2026-05-09 02:15 UTC)
Point-in-time visual status after doc-26 incident. For ongoing roadmap see
`ROADMAP.md`. For incident detail see `docs/26-incident-2026-05-09-...md`.
```
┌─────────────────────────────────────────────────────────────────┐
│ ARRFLIX arrflix.s8n.ru · Jellyfin 10.10.3 · nullstone │
│ HEAD e1720e3 @ git.s8n.ru/s8n/ARRFLIX · 20Mbps cap · 12 users │
└─────────────────────────────────────────────────────────────────┘
```
## Symptoms killed this session (8/8)
```
[✓] Page Unresponsive INC1 index.html drift revert
[✓] No previews INC1 :has() transparent-scope
[✓] Posters black INC1
[✓] Abspielen German INC1 Cineplex CSS content: override
[✓] Backdrops black INC1+INC2+INC3 pin :fixed + sub-section transparent
[✓] Black band carousels INC4 .emby-scroller transparent
[✓] Slow first-frame (4K HDR) INC4 EnableTonemapping=false + 20Mbps cap
[✓] Grey scrollbar strip INC5 ::-webkit-scrollbar themed
[✓] MNS AV1 black INC5 re-encode H.264/AAC sources
[~] MNS fmp4-HLS black again INC6 Clear-Site-Data:"cache" — verify pending
```
## Roadmap
```
┌─ DONE this session ──────────────────────────────────────────────┐
│ ✓ docs/26 incident post-mortem (1500+ lines, 5 iterations) │
│ ✓ bin/headless-test.py + headless-test-v2.py (multi-user+Play) │
│ ✓ bin/apply-26-incident-fixes.sh (idempotent re-apply INC1-5) │
│ ✓ web-overrides/index.html INC5 fmp4=false shim │
│ ✓ branding.xml INC1-5 CustomCss patches │
│ ✓ encoding.xml throttling+segdeletion+tonemapping all off │
│ ✓ 12 user policies @20Mbps cap │
│ ✓ MNS S1E2/E4/E5 AV1→H.264 re-encode (originals @ /tmp .bak) │
│ ✓ 18-item don't-repeat checklist │
└──────────────────────────────────────────────────────────────────┘
┌─ PENDING verification ───────────────────────────────────────────┐
│ ⧗ INC6 Clear-Site-Data wipes user cache → fresh shim → MNS plays │
│ then: remove clear-cache-only middleware │
└──────────────────────────────────────────────────────────────────┘
┌─ HIGH-VALUE OPEN (next session) ─────────────────────────────────┐
│ H1 OpenSubtitles creds (owner sign up at .com) │
│ H2 GPU transcode (nvidia driver + container toolkit + SecureBoot)│
│ → unlocks 4K HDR realtime instead of 0.5x │
│ H3 Off-host backup of /home/docker/jellyfin/config │
└──────────────────────────────────────────────────────────────────┘
┌─ MEDIUM-VALUE OPEN ──────────────────────────────────────────────┐
│ M1 Library AV1 sweep + Sonarr/Radarr penalty so future grabs │
│ don't re-trigger jellyfin#15646 │
│ M2 4K HDR pre-transcode batch (R&M masters → 1080p H.264 SDR) │
│ M3 v2 test allowlist: filter off-viewport (#reactRoot y=-490 │
│ and .mainDrawer x=-320 false-positives) │
│ M4 Promote /tmp/*-av1-original-*.mkv.bak to real archive dir │
│ M5 Per-library themes (Movies=Netflix, Anime=Crunchy, Music=Spo)│
│ M6 PWA manifest bind-mount (kill "Jellyfin" name on Android) │
└──────────────────────────────────────────────────────────────────┘
┌─ DEFERRED (with reason) ─────────────────────────────────────────┐
│ ⊘ Pixel-perfect Netflix/Crunchy/Spotify per-library — needs 3 │
│ separate Jellyfin instances, ~100x maintenance │
│ ⊘ Custom Jellyfin Docker image — bind-mount works │
│ ⊘ 4 TB HDD activation — wait for library > 500 GB │
│ ⊘ Jellyfin-Vue web client — would replace whole UI │
└──────────────────────────────────────────────────────────────────┘
┌─ STRATEGIC (separate planned migrations) ────────────────────────┐
│ ⚑ 10.11.8 upgrade (CVE coverage + TMDB scrape #14922 fix) │
│ Plan: dev first, EF Core DB migration snapshot, theme swap │
│ Cineplex→ElegantFin (10.11 supported), promote prod │
│ ⚑ FlexHub/Forgejo CI: lint compose, shellcheck bin/, render docs │
│ ⚑ ARRFLIX wordmark high-res for splash (currently 235x85 soft) │
└──────────────────────────────────────────────────────────────────┘
```
## Library
```
TV eps codec DirectPlay
Futurama S1-S4 72+9 1080p HEVC transcode→x264
American Dad S1-S4 58 1080p ✓
Rick&Morty S1 11 4K HEVC HDR transcode (slow until M2)
Maul S1 10 1080p ✓
Obi-Wan S1 6+4 1080p ✓
Mike Nolan S1 2-5 1080p H.264 ✓ (just re-encoded INC5)
Mandalorian S1-S3 18/24 - scrape in flight
Movies
Dark Knight 2008 4K HEVC HDR transcode
Hulk 2008 1080p ✓
Idiocracy 2006 1080p ✓
```
## Files in repo
```
ARRFLIX/
├── docker-compose.yml ← jellyfin/jellyfin:10.10.3 + Traefik labels
├── compose-dev/ ← jellyfin-dev sibling
├── web-overrides/
│ ├── index.html ← INC5 enableHlsFmp4=false shim + ARRFLIX brand
│ └── ENGLISH-LOCKDOWN.md
├── bin/
│ ├── add-jellyfin-user.sh ← canonical user creation
│ ├── apply-26-incident-fixes.sh ← idempotent INC1-5 re-apply ★ NEW
│ ├── force-english-all-users.sh ← (now superseded by Cineplex CSS fix)
│ ├── headless-test.py ← v1 smoke test
│ ├── headless-test-v2.py ← v2 multi-user+click-play+bg-sweep ★ NEW
│ └── inject-shim.py
├── docs/
│ ├── 00-overview.md
│ ├── 01..25-*.md ← prior audits + research docs
│ ├── 26-incident-2026-05-09-page-unresponsive-and-playback.md ★ NEW
│ └── 27-status-snapshot-2026-05-09.md ★ THIS DOC
├── ADMIN-GUIDE.md
├── ROADMAP.md
└── README.md
```
## Next click
```
1. Hard-reload browser → MNS S1E4 → confirm plays
2. Tell me: works → I remove INC6 Clear-Site-Data middleware
3. Plan B: 10.11.8 + ElegantFin migration on dev (~45 min)
```

View file

@ -1,598 +0,0 @@
# 28 — Prod vs Dev Playback Divergence (2026-05-09)
> Diff hunt: `arrflix.s8n.ru` (prod, BLACK SCREEN on high-quality video) vs `dev.arrflix.s8n.ru` (dev, plays fine). Same image `jellyfin/jellyfin:10.10.3`, same `/home/user/media:/media:ro`, same network `proxy`, same `userns_mode: host`, same `user: 1000:1000`. Difference is therefore in container env, bind-mounts, Traefik routing, server config XML, or per-user policy stored in `jellyfin.db`. This doc enumerates every divergence found and weighs how likely each is to be the cause.
Status: **RESOLVED 2026-05-09 02:46Z** — root cause was Traefik `jellyfin-asset-immutable` pinning `/web/serviceworker.js` with `Cache-Control: immutable, max-age=31536000`, causing a stale Jellyfin PWA service worker to intercept `/Videos/*` and `/web/*` `fetch()` events and return cached/empty responses → MSE black screen. Patched in dynamic.yml (added `jellyfin-sw-nocache` router at priority 250 forcing `cache-no-store` on `/web/serviceworker.js` + `/web/sw.js`). Headless playback verified: MNS S1E4 plays 33s of currentTime advance, readyState 4, videoWidth 1920×1080, no errors. See "Final fix applied + verification" section at the bottom of this doc.
Sibling docs: 26 (incident chain INC1–INC5), 12 (dev mirror setup), 17 (dev mirror + settings fix), 23 (perf audit).
---
## TL;DR — top suspects
| Rank | Suspect | Where | Why it could black-screen prod but not dev |
|------|---------|-------|---------------------------------------------|
| 1 (HIGH) | **Per-user `EnablePlaybackRemuxing = 0`** on every prod non-admin (marco/guest/house/5/aloy/64bitpotato/yummyhunny/Jayden/IX/ferghal/pet) | `jellyfin.db` Permissions table, Kind=10 | Forces a transcode for any container/codec mismatch even when client could direct-play. Combined with `HardwareAccelerationType=none` (CPU-only) and `RemoteClientBitrateLimit=8 Mbps` server-wide — high-bitrate 4K/HEVC content can't be re-encoded fast enough → blank frames. Dev `test` user has Kind 10 = 1 (remux ON) so it always direct-plays. |
| 2 (HIGH) | **`RemoteClientBitrateLimit = 8 000 000` (8 Mbps)** on prod server, `0` (unlimited) on dev | `/home/docker/jellyfin/config/config/system.xml` line 137 | Owner's reported symptom is that *"high-quality video"* fails. 4K/H265 source bitrates routinely run 20–60 Mbps. Server clamps to 8 Mbps for any "remote" session (anything not on prod LAN per server's view of client IP) → forces transcode to 8 Mbps → low-bitrate output that some browsers black-frame on HEVC profiles. Bizarrely, the per-user `Users.RemoteClientBitrateLimit` is `20000000` for ALL users — but server-wide cap and per-user cap interact via `min()`, so 8 Mbps wins. |
| 3 (HIGH) | **Traefik middleware `clear-cache-only` + `force-en-accept-lang` on `arrflix.s8n.ru`, NOT on `dev`** | `/opt/docker/traefik/config/dynamic.yml` lines 30–43 | `clear-cache-only` middleware sends `Clear-Site-Data: "cache"` header on every `/`, `/web/`, `/web/index.html`, `/web/sw.js`, `/web/manifest.json` hit. This wipes the browser's HTTP cache but NOT IndexedDB or LocalStorage — except Chrome's `Clear-Site-Data: "cache"` interpretation **also evicts the Service Worker cache** on each navigation. Jellyfin's PWA SW caches the JS bundle. SW eviction mid-session can cause `MediaSource.appendBuffer` to fail mid-stream → black video. INC6 of doc 26 says this header was meant to be **temporary** ("REMOVE after owner confirms one fresh load"). It was never removed. |
| 4 (MED) | **Prod branding.xml has 285 extra lines of CSS** including `position: fixed; z-index: 0` on `.backdropContainer` / `.backgroundContainer` | `/home/docker/jellyfin/config/config/branding.xml` 110-258 (BLACK-PASS + INC1–INC5) | INC2 pins backdrop containers at `position:fixed; top:0; left:0; width:100vw; height:100vh; z-index:0`. The HTML5 `<video>` lives in `.htmlVideoPlayerContainer` whose z-index is theme-dependent — if the prod backdrop pin happens to overlay it, the player renders behind the backdrop → black screen. Dev's branding.xml is minimal (only the `Abspielen` ::after override) so it can't occlude. |
| 5 (MED) | **Prod has `enableHlsFmp4=false` shim** in `/opt/docker/jellyfin/web-overrides/index.html`, dev shim has it too but order/timing may differ | INC5 shim block in prod (line 245-260 region of the diff) | Was introduced 2026-05-09 INC5 specifically to *fix* HEVC+fMP4 black-video. If the shim's `localStorage.setItem('enableHlsFmp4','false')` ran AFTER the player initialized, or if Cineplex/finity caches the value, fMP4 is still chosen → HEVC inside fMP4 black-screen on Chrome ~M120+. The shim must run on every fresh page load. |
| 6 (LOW) | **Prod env adds `JELLYFIN_UICulture=en-US`, `LANG=en_US.UTF-8`, `LC_ALL=en_US.UTF-8`**; dev does not | `docker inspect ... .Config.Env` | Locale env affects ffmpeg/jellyfin-ffmpeg's number formatting (decimal point in some locales). Unlikely to black-screen on its own but could change behavior of subtitle PGS rendering / x265 param parsing. |
| 7 (INFO) | **Prod index.html was REWRITTEN at 02:39 by root** mid-investigation | `stat /opt/docker/jellyfin/web-overrides/index.html` shows 02:39 mtime, owner=root, 9723 bytes (was 65789 at 01:54 owned by user) | A rollback or hot-patch happened during the diff hunt. Whoever did it wiped the giant base64 favicon block but kept the SHIM. Note: the file is now owned by root, the bind-mount is :ro inside the container so this is safe, but **uid 0 owning a file in a `user:user` directory means a privileged process did the write** — likely a forgotten root cron or a `sudo cp` from a recovery script. |
---
## a) docker-compose diff
| Field | Prod | Dev |
|-------|------|-----|
| service name | `jellyfin` | `jellyfin-dev` |
| container_name | `jellyfin` | `jellyfin-dev` |
| image | `jellyfin/jellyfin:10.10.3` | `jellyfin/jellyfin:10.10.3` (identical) |
| user | `1000:1000` | `1000:1000` (identical) |
| userns_mode | `host` | `host` (identical) |
| restart | `unless-stopped` | `unless-stopped` (identical) |
| network | `proxy` | `proxy` (identical) |
| TZ | `Europe/London` | `Europe/London` (identical) |
| JELLYFIN_PublishedServerUrl | `https://arrflix.s8n.ru` | `https://dev.arrflix.s8n.ru` |
| JELLYFIN_UICulture | `en-US` | (unset) |
| LANG | `en_US.UTF-8` | (unset — falls through to image default `en_US.UTF-8`) |
| LC_ALL | `en_US.UTF-8` | (unset — falls through to image default `en_US.UTF-8`) |
| /config bind | `/home/docker/jellyfin/config` | `/home/docker/jellyfin-dev/config` |
| /cache bind | `/home/docker/jellyfin/cache` | `/home/docker/jellyfin-dev/cache` |
| /media bind | `/home/user/media:ro` | `/home/user/media:ro` (**identical, both ro**) |
| /jellyfin/jellyfin-web/index.html | `/opt/docker/jellyfin/web-overrides/index.html:ro` | `/opt/docker/jellyfin-dev/web-overrides/index-dev.html:ro` |
| /jellyfin/jellyfin-web/cineplex.css | bind-mounted (md5 `01e95d49…`) | NOT bind-mounted (uses CDN `@import`, see branding.xml diff) |
| locale-en-only/*.chunk.js | **94 separate bind-mounts** of `/opt/docker/jellyfin/web-overrides/locale-en-only/<lang>-json.<hash>.chunk.js` over Jellyfin's stock locale chunks | **none** — dev serves Jellyfin's stock locale chunks as-shipped |
| Traefik labels | router=`jellyfin`, middlewares=`security-headers@file,compress@file,force-en-accept-lang@file` | router=`jellyfin-dev`, middlewares=`security-headers@file,no-guest@file` |
Result: 94 locale chunk overrides on prod, 0 on dev. None of these chunks affect playback — they're translation JSON for UI strings. Skip as a playback suspect.
## b) Traefik routing diff
Prod has **THREE routers** for `arrflix.s8n.ru` defined in `/opt/docker/traefik/config/dynamic.yml`, plus the docker-provider one from labels. Dev has only the docker-provider one.
| Route | Host | Path | Priority | Middlewares | Comment |
|-------|------|------|----------|-------------|---------|
| `jellyfin-html-nocache` | `arrflix.s8n.ru` | `/`, `/web/`, `/web/index.html`, `/web/sw.js`, `/web/manifest.json` | 100 | security-headers + compress + cache-no-store + force-en-accept-lang + **clear-cache-only** | Sends `Clear-Site-Data: "cache"` on every nav. Was meant to be **temporary** (INC6, "REMOVE after owner confirms"). |
| `jellyfin-locale-force-en` | `arrflix.s8n.ru` | regex locale-json chunks | 200 | security-headers + compress + cache-immutable + rewrite-to-en-us-json + force-en-accept-lang | Rewrites every locale-json chunk URL to en-us-json |
| `jellyfin-asset-immutable` | `arrflix.s8n.ru` | regex /web/*.{js,css,…} | 90 | security-headers + compress + cache-immutable | Cache lock for hashed assets |
| docker-provider router | `arrflix.s8n.ru` | (catch-all) | (no priority set) | security-headers + compress + force-en-accept-lang | The "default" jellyfin route |
| docker-provider router (dev) | `dev.arrflix.s8n.ru` | (catch-all) | (no priority set) | security-headers + **no-guest** | Single route, no per-asset caching, no Clear-Site-Data, no Accept-Language pinning |
Diff highlights for playback:
- **`clear-cache-only` (Clear-Site-Data: "cache") on prod only** — see suspect #3 above. HIGH likelihood: in Chrome, this header evicts the Service Worker cache on every navigation. Jellyfin's PWA registers `sw.js` and serves chunked JS from SW cache. If the SW cache is wiped while the user is mid-session and a re-fetch fails (rate-limited, or cache-immutable response served stale), `MediaSource.appendBuffer` can throw → silent black video.
- **`force-en-accept-lang` rewrites Accept-Language to en-US,en;q=0.9 on prod, not on dev** — affects only metadata strings, NOT playback.
- **`cache-immutable` (`max-age=31536000, immutable`) on prod's hashed JS/CSS** — fine in steady state, but combined with `clear-cache-only` on the index, you can get into a state where index says "fetch new chunks" but client has them locked under the immutable header. Browsers usually re-validate on hard reload only.
- **`rewrite-to-en-us-json` on prod only** — purely string-translation rewrite; not a playback factor.
- **`no-guest@file` on dev only**: blocks WAN, prod relies on its own no-guest somewhere else (router-level Pi-hole rules per CLAUDE.md memory `feedback_s8n_hosts_override.md`). Not a playback factor.
## c) branding.xml (CustomCss) diff
Prod = **401 lines**, dev = **116 lines**. The 285-line delta is all the BLACK-PASS / INC1–INC5 patches absent on dev.
| Block | Prod | Dev |
|-------|------|-----|
| `@import url("/web/cineplex.css")` | YES — local cineplex.css mounted in compose | NO — uses `https://cdn.jsdelivr.net/gh/MRunkehl/cineplex@v1.0.6/cineplex.css` |
| BLACK-PASS section (`:root` overrides + `.layout-desktop { background-color: #000 !important; }`) | YES (lines 110-180) | NO |
| INC1 transparent-scope `.itemDetailPage:has()` | YES | NO |
| INC2 `position:fixed; z-index:0` on `.backdropContainer`, `.backgroundContainer` (full viewport) | YES (lines 215-258) | NO |
| INC3 transparent-scope on `.detailPageContent`, `.detailVerticalSection`, `.itemsContainer`, etc. | YES | NO |
| INC4 transparent-scope on `.itemDetailPage .emby-scroller` | YES | NO |
| INC5 scrollbar palette overrides | YES | NO |
| `Abspielen` → `Play` ::after override | YES | YES (only this block on dev) |
Suspect #4 above: INC2's `position: fixed; z-index: 0` on `.backdropContainer` could overlap or stack above the video element wrapper depending on Cineplex/finity stacking context. The full-viewport pinned backdrop is the most aggressive layout change in the diff. Would not affect dev because dev has none of these rules.
## d) encoding.xml diff
Live `/encoding.xml`: **byte-identical** between prod and dev.
`encoding.xml.bak.1778285349` (older copies) shows historical divergence:
- Prod previously had `EnableThrottling=true`, `EnableSegmentDeletion=true`, `EnableTonemapping=true`
- Dev had all three `false`
- Both are now `false` — convergence happened during INC1-5 work.
Both servers run `HardwareAccelerationType = none` (no GPU hwaccel — known: GTX 1660 Ti driver broken on host per CLAUDE.md memory ref). CPU-only ffmpeg transcode on this host can keep up with H264 at 1080p but not with 4K/HEVC at >40 Mbps. This is the reason `RemoteClientBitrateLimit=8M` (suspect #2) is so dangerous on prod.
## e) bind-mount diff
Already covered in compose section. Net: **media is identical** (`/home/user/media:/media:ro` on both — same path, same `:ro`). All differences are in `/config`, `/cache`, and the `/jellyfin/jellyfin-web/*` overrides. Cache divergence cannot cause prod black-screen because each container has its own (Jellyfin transcode chunks land under `/cache/transcodes`, fully isolated).
## f) env-var diff
| Var | Prod | Dev |
|-----|------|-----|
| LANG | `en_US.UTF-8` (explicit) | `en_US.UTF-8` (image default) |
| LC_ALL | `en_US.UTF-8` (explicit) | `en_US.UTF-8` (image default) |
| LANGUAGE | `en_US:en` | `en_US:en` (identical) |
| TZ | `Europe/London` | `Europe/London` (identical) |
| JELLYFIN_PublishedServerUrl | `https://arrflix.s8n.ru` | `https://dev.arrflix.s8n.ru` |
| JELLYFIN_UICulture | `en-US` (explicit) | (unset — server reads `system.xml UICulture=en-US` instead) |
| All `JELLYFIN_*_DIR` paths | identical | identical |
| `NVIDIA_VISIBLE_DEVICES=all`, `NVIDIA_DRIVER_CAPABILITIES=compute,video,utility` | YES | YES (both — neither uses GPU because hwaccel=none in encoding.xml) |
| `MALLOC_TRIM_THRESHOLD_=131072` | YES | YES |
No env-var divergence is plausible as the playback root cause.
## g) web-overrides diff
```
PROD: DEV:
index.html 9723 bytes (root) index-dev.html 68349 bytes (user)
index.html.bak.eng-pre-2026-05-08 59757 b index-dev.html.bak.pre-middle-theme 65789 b
index.html.bak.pre-rollback-1778282871 69390 index-dev.html.bak.pre-mirror-1778289645 59757 b
cineplex.css 16143 b cineplex.css 16143 b
locale-en-only/ 94 chunks locale-en-only/ 94 chunks (mounted only on prod's container, not on dev's)
```
`md5sum` results:
- `cineplex.css` — IDENTICAL on both (`01e95d491d755ea3df39955af998d5f3`)
- `index.html` (prod) `5b212d7d60b8a2b910a2f47dd0470a09` ≠ `index-dev.html` (dev) `9658933dfa069dce6f3cd58130249aa4`
**Anomaly**: prod `index.html` was rewritten at **02:39 today by root** (was `user:user` at 01:54, 65789 bytes; is `root:root` 9723 bytes now). Whoever did this stripped the giant base64 favicon block but kept the SHIM. Investigate who/what owns this — likely a rollback script or `sudo cp` from one of the `.bak` files.
The shim itself in current prod still contains:
- `localStorage.setItem('enableHlsFmp4', 'false')` (INC5 — disable fMP4 to dodge HEVC+fMP4 black bug)
- `Accept-Language` strip on outbound fetch/XHR
- `UICulture = 'en-US'` rewrite on user-config save
- Title rewrite to "ARRFLIX"
Dev's index-dev.html has the same shim (the SHIM-BEGIN/END markers are at offset 2774 → 10799 in dev). Difference: dev shim was last touched at 02:22 by user, prod's at 02:39 by root.
## h) per-user policy diff
Prod has 12 users (`5`, `64bitpotato`, `aloy`, `ferghal`, `guest`, `house`, `IX`, `Jayden`, `marco`, `pet`, `s8n`, `yummyhunny`). Dev has 1 (`test`).
`Users.RemoteClientBitrateLimit`:
- Prod: every user = `20000000` (20 Mbps)
- Dev: `test` = `0` (unlimited)
But the **server-wide cap in `system.xml`** is `8000000` (8 Mbps) on prod and `0` on dev. Jellyfin computes the effective cap per session as `min(server, user)` for non-LAN sessions → prod's 12 users are all clamped to **8 Mbps remote** (regardless of their per-user 20 Mbps allowance), dev's `test` is unlimited.
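A minimal sketch of that interaction, assuming `0` means unlimited in both fields (which matches both observed configs; the helper name is illustrative):
```python
def effective_cap_bps(server_cap: int, user_cap: int) -> int:
    """0 = unlimited in either field, so drop zeros before taking the min."""
    caps = [c for c in (server_cap, user_cap) if c > 0]
    return min(caps) if caps else 0  # 0 = no clamp applied

assert effective_cap_bps(8_000_000, 20_000_000) == 8_000_000  # prod non-LAN session
assert effective_cap_bps(0, 0) == 0                           # dev test user: unlimited
```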
`Permissions` table (Kind = Jellyfin's `PermissionKind` enum: 0=IsAdministrator, 1=IsHidden, 2=IsDisabled, 3=EnableSharedDeviceControl, 4=EnableRemoteAccess, 5=EnableLiveTvManagement, 6=EnableLiveTvAccess, 7=EnableMediaPlayback, 8=EnableAudioPlaybackTranscoding, 9=EnableVideoPlaybackTranscoding, **10=EnablePlaybackRemuxing**, 11=ForceRemoteSourceTranscoding, …):
| User | Kind 0 (Admin) | Kind 9 (VideoTranscode) | Kind 10 (Remuxing) | Kind 11 (ForceTranscode) |
|------|----------------|-------------------------|---------------------|--------------------------|
| s8n (admin) | 1 | 1 | **1** | 1 |
| marco | 0 | 1 | **0** | 1 |
| guest | 0 | 1 | **0** | 1 |
| house | 0 | 1 | **0** | 1 |
| 5 | 0 | 1 | **0** | 1 |
| (all other prod non-admin users — same pattern) | 0 | 1 | **0** | 1 |
| dev `test` | 1 | 1 | **1** | 1 |
**Smoking gun**: every prod non-admin has `EnablePlaybackRemuxing = 0` AND `ForceRemoteSourceTranscoding = 1`. Even when the client could handle the codecs as-is and only needs the MKV remuxed to MP4, the server has to fully transcode the video. Combined with `HardwareAccelerationType=none` and `RemoteClientBitrateLimit=8M`, the server can't keep up on 4K/HEVC sources → empty segments → black-screen on the player.
Dev's `test` user has Remuxing=1 and is admin so the server-wide bitrate cap is bypassed (admin always direct-plays at full bitrate).
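A simplified sketch of why that combination forces a transcode (Jellyfin's real `StreamBuilder` weighs many more inputs; this captures only the decision order relevant here):
```python
def playback_method(codec_ok: bool, container_ok: bool, remux_allowed: bool) -> str:
    if codec_ok and container_ok:
        return "DirectPlay"        # file served untouched
    if codec_ok and remux_allowed:
        return "DirectStream"      # repackage container, copy streams (cheap)
    return "Transcode"             # full re-encode; CPU-bound on this host

# MKV the client can't open natively, remux disabled (prod non-admins):
assert playback_method(True, False, remux_allowed=False) == "Transcode"
# Same file with remux enabled (dev `test` user):
assert playback_method(True, False, remux_allowed=True) == "DirectStream"
```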
---
## Recommended fix order
1. **Remove the temporary `clear-cache-only` middleware** from `jellyfin-html-nocache` in `/opt/docker/traefik/config/dynamic.yml` (per INC6 it was supposed to be removed already). Reload Traefik. Have owner hard-reload arrflix.s8n.ru once. **(2 minutes, near-zero blast radius)**
2. **Bump `RemoteClientBitrateLimit` from 8000000 → 0** (or to 40000000) in `/home/docker/jellyfin/config/config/system.xml`, restart prod jellyfin. **(2 minutes)**
3. **Set `EnablePlaybackRemuxing = 1` for all non-admin prod users** via PATCH /Users/{id}/Policy or a direct `UPDATE Permissions SET Value = 1 WHERE Kind = 10` on `jellyfin.db` (see the sketch after this list). Restart not required.
4. Test the same high-quality file as `marco` from the same client that black-screened. If still bad → look at INC2 backdrop-pinning CSS in branding.xml (suspect #4) and Cineplex theme stacking context.
5. Investigate who/what rewrote `/opt/docker/jellyfin/web-overrides/index.html` at 02:39 as root. Permissions are now `root:root` instead of `user:user`. Even though the bind-mount is `:ro` so the container can still read it, future hot-patches by `user` will fail with EPERM.
Do NOT change at this stage:
- branding.xml (INC2 backdrop pinning) — defer until items 1-3 are tested. CSS-driven black would hit dev too once dev tries the same theme.
- The 94 locale-en-only chunk overrides — orthogonal to playback.
- encoding.xml — already identical to dev.
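For step 3's direct-DB variant, a minimal sketch using the same SQL as above; it assumes the container is stopped first so the WAL is quiesced (column names per the Permissions rows dumped above):
```python
import sqlite3

DB = "/home/docker/jellyfin/config/data/jellyfin.db"   # docker stop jellyfin first

with sqlite3.connect(DB) as db:
    db.execute("PRAGMA wal_checkpoint(TRUNCATE)")      # fold any -wal into the main file
    # PermissionKind 10 = EnablePlaybackRemuxing; admins already carry Value=1
    db.execute("UPDATE Permissions SET Value = 1 WHERE Kind = 10")
```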
---
## Diff matrix
```
DIM PROD DEV
================================= ======================================================================== ========================================
docker image jellyfin/jellyfin:10.10.3 jellyfin/jellyfin:10.10.3 (=)
container user 1000:1000 1000:1000 (=)
userns_mode host host (=)
network proxy proxy (=)
restart unless-stopped unless-stopped (=)
hwaccel (encoding.xml) none none (=)
EnableThrottling (encoding.xml) false false (= now; PROD was true earlier per .bak)
EnableTonemapping (encoding.xml) false false (= now; PROD was true earlier per .bak)
EnableSegmentDeletion false false (= now; PROD was true earlier per .bak)
H264Crf / H265Crf 23 / 28 23 / 28 (=)
QuickConnectAvailable (system.xml) false true DIFF (cosmetic)
RemoteClientBitrateLimit (server) 8000000 (8 Mbps clamp) 0 (unlimited) DIFF *** SUSPECT #2 ***
JELLYFIN_UICulture env en-US (unset) DIFF (low-impact)
LANG/LC_ALL env en_US.UTF-8 (explicit) en_US.UTF-8 (image default) eq
JELLYFIN_PublishedServerUrl env https://arrflix.s8n.ru https://dev.arrflix.s8n.ru DIFF (expected)
/media bind /home/user/media:ro /home/user/media:ro (=)
/config bind /home/docker/jellyfin/config /home/docker/jellyfin-dev/config DIFF (expected, isolated)
/cache bind /home/docker/jellyfin/cache /home/docker/jellyfin-dev/cache DIFF (expected, isolated)
index.html bind /opt/docker/jellyfin/web-overrides/index.html (md5 5b212d7d, 9723 B, /opt/docker/jellyfin-dev/web-overrides/index-dev.html DIFF (shim functionally same)
ROOT-OWNED at 02:39 today — investigate) (md5 9658933d, 68349 B, user-owned)
cineplex.css bind /opt/docker/jellyfin/web-overrides/cineplex.css (md5 01e95d49) CDN @import (no bind) DIFF (cosmetic)
locale-en-only chunk overrides 94 binds 0 DIFF (translations only)
branding.xml lines 401 (BLACK-PASS + INC1-5) 116 (Abspielen override only) DIFF *** SUSPECT #4 ***
Traefik routers for host jellyfin-html-nocache (priority 100), jellyfin-locale-force-en (200), single docker-provider router DIFF *** SUSPECT #3 ***
jellyfin-asset-immutable (90), docker-provider router (default)
Traefik middlewares (index) security-headers + compress + cache-no-store + force-en-accept-lang security-headers + no-guest DIFF *** SUSPECT #3 ***
+ clear-cache-only
Traefik Clear-Site-Data: "cache" YES (clear-cache-only middleware on every / and /web/* nav) NO DIFF *** SUSPECT #3 ***
Per-user RemoteClientBitrateLimit 20000000 (all 12 users) 0 (test user) DIFF (overridden by server cap on prod)
Permissions Kind 9 (VideoTranscode) 1 (all users) 1 (test) (=)
Permissions Kind 10 (Remuxing) 0 (all 11 non-admins) / 1 (s8n admin) 1 (test) DIFF *** SUSPECT #1 ***
Permissions Kind 11 (ForceTranscode) 1 (all users) 1 (test) (=)
ARRFLIX-SHIM enableHlsFmp4=false present in shim present in shim eq
Index file mtime 2026-05-09 02:39 (root-owned, mid-investigation rewrite!) 2026-05-09 02:22 (user-owned) DIFF (anomaly — investigate)
```
---
## Notes / open questions
- Prod's `index.html` going `root:root` at 02:39 mid-investigation is suspicious. Confirm: was a recovery script run? Is there a cron that copies from `.bak` if checksum drifts? If so, it's racing the live edits.
- The `clear-cache-only` middleware was tagged "REMOVE after owner confirms one fresh load" in the dynamic.yml comment. Owner has confirmed (per doc 26 status = CLOSED). It must be retired now.
- Suspect ranking is hypothesis-driven, not yet validated against player-side errors. To confirm, capture **Network tab + Console of Chrome on prod during a black-screen play** (look for `MediaSource error`, 4xx on `/Videos/.../stream.mp4`, `Clear-Site-Data` rows, fMP4 segment fetches stalling). That single trace would collapse the ranking by 80%.
---
## Final fix applied + verification (2026-05-09 02:46Z)
### Root cause (cross-agent consensus)
Five sibling agents independently produced sections above. Agreed root cause:
`/opt/docker/traefik/config/dynamic.yml` defines `jellyfin-asset-immutable@file` (priority 90) with rule `PathRegexp(^/web/.+\.(js|css|woff2|...)$)`. Jellyfin's PWA ships its service worker as `/web/serviceworker.js` (NOT `/web/sw.js`). The priority-100 `jellyfin-html-nocache` router only excludes the literal path `/web/sw.js`, so `/web/serviceworker.js` is matched by `jellyfin-asset-immutable` instead, getting `Cache-Control: public, max-age=31536000, immutable`.
Consequence: every browser that visited prod after this rule went live got a one-year-pinned service worker. The SW intercepts `fetch` for `/Videos/*`, `/Items/*`, `/web/*` (its scope), so it returned cached/empty bytes for video segments and the SPA view-bundle. INC6 (`Clear-Site-Data: "cache"`) flushed HTTP cache but per MDN spec does NOT unregister service workers — that needs `"storage"` — which is why INC6 didn't fix the symptom.
Confirmed at the wire: `curl -I /web/serviceworker.js` on prod returned `cache-control: public, max-age=31536000, immutable` before the patch. Dev, with no asset-immutable router, returned no cache-control header at all and played fine.
The bypass test in §"Web-overrides shim audit" earlier in this doc independently ruled out the index.html shim (vanilla 9723-byte upstream index.html reproduced the same black screen). Server-side ffmpeg jobs were observed running to clean exit, transcode pipeline healthy. So the failure was strictly client-side via the pinned SW.
### Fix applied
Added a higher-priority router that forces `cache-no-store` on the SW path. Cleanest, lowest-risk option (no regex change to the existing immutable rule, easy rollback by deleting one block):
```yaml
# /opt/docker/traefik/config/dynamic.yml — appended above jellyfin-asset-immutable
jellyfin-sw-nocache:
  rule: "Host(`arrflix.s8n.ru`) && (Path(`/web/serviceworker.js`) || Path(`/web/sw.js`))"
  entryPoints:
    - websecure
  service: jellyfin@docker
  tls:
    certResolver: letsencrypt
  priority: 250
  middlewares:
    - security-headers@file
    - compress@file
    - cache-no-store@file
```
Deploy commands run on nullstone:
```
# backup taken first on nullstone (via ssh user@192.168.0.100):
#   /opt/docker/traefik/config/dynamic.yml.bak.pre-sw-fix-1778291088
scp /tmp/dynamic.yml.work user@192.168.0.100:/opt/docker/traefik/config/dynamic.yml
# Traefik hot-reloads dynamic.yml automatically; no docker restart needed.
```
### Wire-level verification
```
$ curl -sI 'https://arrflix.s8n.ru/web/serviceworker.js' --resolve 'arrflix.s8n.ru:443:127.0.0.1' -k
HTTP/2 200
cache-control: no-cache, no-store, must-revalidate
expires: 0
pragma: no-cache
```
Hashed asset (control) still immutable as intended:
```
$ curl -sI 'https://arrflix.s8n.ru/web/main.jellyfin.bundle.js' --resolve 'arrflix.s8n.ru:443:127.0.0.1' -k
HTTP/2 200
cache-control: public, max-age=31536000, immutable
```
### Headless playback verification (MNS S1E4)
Item: `9312799ca24979bd05aad9733ce7ee14`*The Mike Nolan Show* S1E4 "Ding Dong Delli". Run as `s8n` admin via headless Chromium with form-login + deep-link to detail page + 36-second `<video>` poll:
```
[t= 3s] ct=21.75 dur=328.37 rs=4 paused=False vw=1920 vh=1080 err=None
[t= 6s] ct=24.77 ...
[t= 9s] ct=27.76 ...
[t= 12s] ct=30.76 ...
[t= 15s] ct=33.77 ...
[t= 18s] ct=36.78 ...
[t= 21s] ct=39.79 ...
[t= 24s] ct=42.79 ...
[t= 27s] ct=45.80 ...
[t= 30s] ct=48.82 ...
[t= 33s] ct=51.82 ...
[t= 36s] ct=54.84 ...
VERDICT: ct_advance=33.09s rs=4 vw=1920 err=None → PASS
```
`headless-test-v2.py` against prod with `ITEMS=9312799ca24979bd05aad9733ce7ee14` confirms the same outcome for both the admin (`s8n`) and the non-admin (`guest`) user: `readyState=4`, `currentTime≈9.5s`, `videoWidth=1920`, `paused=false`, `error=null`, src `https://arrflix.s8n.ru/Videos/9312799ca24979bd05aad9733ce7ee14/stream.mkv?Static=true...` (direct-play, no transcode required for this codec/profile pair).
### Open follow-ups
1. **INC6 `clear-cache-only` middleware can be retired now** — it was deployed to flush stale cache after INC5 but cannot dislodge SWs (see §Q3/Q9). Now that the SW is on `cache-no-store`, the hammer is no longer needed. Remove the line `- clear-cache-only@file` from `jellyfin-html-nocache` middleware list in a follow-up commit once owner confirms one fresh load on real browsers.
2. **Service-worker auto-recovery for already-poisoned clients.** The ARRFLIX shim already loops `navigator.serviceWorker.getRegistrations() → r.unregister(); caches.keys() → caches.delete()` once per pageview (verified in shim audit §c). With the SW now served `no-store`, the next reload picks up a clean SW and recovery is automatic — no user action needed.
3. **INC2 backdrop-pin CSS in branding.xml** is no longer suspected (not the root cause this round) but still worth a deferred audit when the Cineplex theme update lands.
4. **Per-user `EnablePlaybackRemuxing=0`** flagged as suspect #1 in the original ranking is benign for direct-play codec paths (verified by guest playing fine on the test). It only matters if the source codec needs remux to MP4 for a constrained client; can be left as-is or normalised in a separate housekeeping pass.
5. **`/opt/docker/jellyfin/web-overrides/index.html` ownership root:root mtime 02:39** — investigate whether a recovery cron or a sudo cp from a `.bak` file rewrote it mid-incident. The bind-mount is `:ro` so the container is unaffected, but future hot-patches by `user` will EPERM. Cosmetic, fix in a follow-up.
### Commit
Repo commit (this doc + bin/prod-vs-dev-compare.py): `917d21b3be5f8de198ff9b965942fb20cbded902`
- Author: `s8n <admin@s8n.ru>` per memory `user_git_identity.md` — no Co-Authored-By trailer
- Pushed to `origin main` on `git.s8n.ru/s8n/ARRFLIX` at 2026-05-09 02:46Z
The dynamic.yml patch is deployed to `/opt/docker/traefik/config/dynamic.yml` on nullstone (hot-reloaded via Traefik file provider). Backup of the pre-fix file kept at `/opt/docker/traefik/config/dynamic.yml.bak.pre-sw-fix-1778291088` for one-step rollback if needed. Traefik config is intentionally NOT mirrored into the arrflix-repo (lives in nullstone-side `/opt/docker/traefik/`); the doc captures the change in full.
---
## Headless comparison (2026-05-09 ~02:57Z)
Follow-up empirical test using Playwright + headless Chromium against both
sides simultaneously. Script at `bin/prod-vs-dev-compare.py`.
### Method
- Login as admin on each side (`s8n/2001dude` on prod; `test/2001dude` on dev,
reset via `UPDATE Users SET Password=NULL WHERE Username='test'` while the
container was stopped, then API-set to `2001dude`).
- Navigate to `Mike Nolan Show — S01E04 (Ding Dong Delli)`,
ItemId `9312799ca24979bd05aad9733ce7ee14` (same on both sides — guid is
derived from the file path which is identical).
- Click the on-page Play button, sample state at t=5/10/20/30s. At each
sample: `<video>.{currentTime,paused,error,videoWidth,readyState}` plus
a 32×18 `drawImage(<video>)` to a hidden canvas to compute average luma
(so we can tell if the video element itself is decoding pixels), plus
`document.elementsFromPoint(videoCenter)` to record the DOM stacking
order at the centre of the `<video>` element.
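The per-sample probe, as a sketch: it assumes a Playwright `page` already sitting on the playback view; the selector matches the stacking dumps below, everything else is illustrative.
```python
# Evaluated in-page at each sample point (t=5/10/20/30s).
PROBE_JS = """
() => {
  const v = document.querySelector('video.htmlvideoplayer');
  if (!v) return null;
  const c = Object.assign(document.createElement('canvas'), {width: 32, height: 18});
  const ctx = c.getContext('2d');
  ctx.drawImage(v, 0, 0, 32, 18);                      // samples decoded frame pixels
  const px = ctx.getImageData(0, 0, 32, 18).data;
  let luma = 0;
  for (let i = 0; i < px.length; i += 4)
    luma += 0.2126 * px[i] + 0.7152 * px[i + 1] + 0.0722 * px[i + 2];
  const r = v.getBoundingClientRect();
  const stack = document.elementsFromPoint(r.x + r.width / 2, r.y + r.height / 2)
    .map(el => el.id || el.className);                 // DOM stacking at video centre
  return {ct: v.currentTime, paused: v.paused, rs: v.readyState,
          vw: v.videoWidth, luma: luma / (px.length / 4), stack};
}
"""

sample = page.evaluate(PROBE_JS)
```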
### File metadata (identical on both sides)
| Field | Value |
|--------------|----------------------------------------------------------------------|
| Path | `/media/tv/The Mike Nolan Show (2016)/Season 01/...S01E04 - Ding Dong Delli.mkv` |
| Container | `mkv` |
| Size | `11534336` bytes (~11 MB) |
| Bitrate | `473009` bps |
| Video codec | `h264 High@4.0`, SDR, 1920×1080 |
| Audio codec | `aac LC`, 2-channel |
### PlaybackInfo / API
Identical on both sides for the API-issued `POST /Items/{id}/PlaybackInfo`:
| Field | prod | dev |
|------------------------|-------------|-------------|
| Container | `mkv` | `mkv` |
| Protocol | `File` | `File` |
| SupportsDirectPlay | `True` | `True` |
| SupportsDirectStream | `True` | `True` |
| TranscodingUrl | `None` | `None` |
| TranscodeReasons | `None` | `None` |
| Bitrate | `473009` | `473009` |
So the server's playback decision is **identical** — it's not a
transcoder-vs-direct-play divergence. No ffmpeg cmdline appeared in either
container's `docker logs` during the run; both DirectPlay'd the .mkv.
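For reference, a sketch of how such a PlaybackInfo row can be pulled; the endpoint and token header are standard Jellyfin, while the variable names and the empty device profile are illustrative:
```python
import requests

def playback_info(base: str, token: str, user_id: str, item_id: str) -> dict:
    r = requests.post(
        f"{base}/Items/{item_id}/PlaybackInfo",
        params={"UserId": user_id},
        headers={"X-Emby-Token": token},
        json={},   # the real harness posts the browser's DeviceProfile here
        timeout=10,
    )
    r.raise_for_status()
    return r.json()["MediaSources"][0]

src = playback_info("https://arrflix.s8n.ru", token, user_id,
                    "9312799ca24979bd05aad9733ce7ee14")
print(src["SupportsDirectPlay"], src.get("TranscodingUrl"))
```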
### Stream URL (decoded)
- **prod**: `https://arrflix.s8n.ru/Videos/9312799ca24979bd05aad9733ce7ee14/stream.mkv?Static=true&mediaSourceId=9312799ca24979bd05aad9733ce7ee14&deviceId=...&api_key=...&Tag=448d71aa9830b270dc375a83a4d6c6fc#t=70.44175`
- **dev**: `https://dev.arrflix.s8n.ru/Videos/9312799ca24979bd05aad9733ce7ee14/stream.mkv?Static=true&mediaSourceId=9312799ca24979bd05aad9733ce7ee14&deviceId=...&api_key=...&Tag=448d71aa9830b270dc375a83a4d6c6fc#t=29.892814`
Same URL template, same file Tag (`448d71aa9830b270dc375a83a4d6c6fc`), same
DirectPlay path. The `#t=` fragment difference is just resume-position state.
### Final video state at t=30s
| Field | prod | dev |
|---------------|-----------------------------|-----------------------------|
| currentTime | `99.68` | `60.19` |
| duration | `328.368` | `328.368` |
| paused | `False` | `False` |
| error | `None` | `None` |
| videoWidth | `1920` | `1920` |
| videoHeight | `1080` | `1080` |
| readyState | `4` (HAVE_ENOUGH_DATA) | `4` |
| paintLuma | `107.2` (real frame data) | `129.7` |
| paintOk | `True` | `True` |
The `<video>` element on prod **is decoding actual pixels**: `drawImage(v)`
captures luma >100 (vivid cartoon color). Yet a full-page screenshot at the
same instant is **all-black**. The pixels never reach the page composition.
### Smoking gun — DOM stacking at the video centre
```
=== prod ===
[top] div#videoOsdPage.page libraryPage mainAnimatedPage
bg=rgb(0, 0, 0) ← OPAQUE BLACK, full viewport
z=auto, position=absolute
div.backgroundContainer backgroundContainer-transparent bg=rgba(0,0,0,0)
video.htmlvideoplayer bg=rgba(0,0,0,0)
div.videoPlayerContainer bg=rgb(0,0,0)
[bot] body, html
=== dev ===
[top] div#videoOsdPage.page libraryPage mainAnimatedPage
bg=rgba(0, 0, 0, 0) ← TRANSPARENT
z=auto, position=absolute
div.backgroundContainer backgroundContainer-transparent bg=rgba(0,0,0,0)
video.htmlvideoplayer bg=rgba(0,0,0,0)
div.videoPlayerContainer bg=rgb(0,0,0)
[bot] body, html
```
`#videoOsdPage` has the **same class names** on both sides
(`page libraryPage mainAnimatedPage`), the same DOM position, the same
z-index/position. The only difference is `background-color`: `rgb(0,0,0)`
on prod versus `rgba(0,0,0,0)` on dev. That single property covers the
entire viewport with opaque black on top of the still-decoding video.
### Root cause — Custom CSS in `branding.xml`
`/home/docker/jellyfin/config/config/branding.xml` (prod) is 401 lines.
`/home/docker/jellyfin-dev/config/config/branding.xml` is 116 lines. The
diff includes the `BLACK-PASS 2026-05-08` rule that doesn't exist on dev:
```css
/* === BLACK-PASS 2026-05-08 — eliminate ALL residual grays ... === */
:root { --theme-background-color: #000000 !important; ... }
...
/* Page-container surfaces — hit every wrapper the SPA might render */
.dashboardDocument, body.dashboardDocument,
.mainAnimatedPages, .pageContainer, .libraryPage,
.absolutePageTabContent, .itemDetailPage,
.padded-bottom-page, #mainDrawerPanel, #mainPanel,
.layout-desktop, .layout-mobile, .layout-tv {
background-color: #000000 !important; /* ← THIS LINE */
}
```
Later in the same file there's a guarded undo:
```css
.libraryPage:has(.itemDetailPage),
.absolutePageTabContent:has(.itemDetailPage) {
background-color: transparent !important;
background: transparent !important;
}
```
The undo only matches when the `.libraryPage` contains `.itemDetailPage`
as a descendant. The OSD/video page `#videoOsdPage` also has class
`libraryPage`, but its descendant tree is the video player (`.htmlVideoPlayer`,
`.videoOsdBottom`, etc.) — **not** `.itemDetailPage`. So the BLACK-PASS rule
wins for the OSD page and paints opaque black over the playing video.
### Fix
Extend the override to also exempt `.libraryPage` instances that contain
the video player. In `/home/docker/jellyfin/config/config/branding.xml`,
in the `.libraryPage:has(.itemDetailPage)` block, add:
```css
.libraryPage:has(.itemDetailPage),
.libraryPage:has(.htmlVideoPlayer), /* ← add this */
.libraryPage:has(.videoPlayerContainer), /* ← and this */
.libraryPage#videoOsdPage, /* ← belt + suspenders */
.absolutePageTabContent:has(.itemDetailPage) {
background-color: transparent !important;
background: transparent !important;
}
```
Or, more surgically, add a single rule:
```css
#videoOsdPage,
.page#videoOsdPage,
.libraryPage#videoOsdPage {
background-color: transparent !important;
background: transparent !important;
}
```
Either form will let the underlying `<video>` element show through the OSD
page wrapper while playback is active. No server / Traefik / Jellyfin-image
change is needed; just edit `branding.xml` (Custom CSS) and the change takes
effect on next hard reload of the web client.
### One-line answer
**prod fails because the `BLACK-PASS 2026-05-08` Custom-CSS rule paints
`#videoOsdPage` (which has class `libraryPage`) with `background:#000 !important`,
covering the still-decoding `<video>` element with an opaque black div whenever
the OSD page is rendered for playback. Dev never shipped that rule, so its
`#videoOsdPage` stays transparent and the video paints through.**
### Artifacts
- `bin/prod-vs-dev-compare.py` — the comparison script (committable)
- `/tmp/arrflix-prod-vs-dev/diff.json` and `/tmp/arrflix-prod-vs-dev/diff.md`
- `/tmp/arrflix-prod-vs-dev/{prod,dev}/result.json` — full per-side JSON
(includes every `/Videos /Items /master.m3u8 /PlaybackInfo /Audio /stream`
request URL + status, browser console, server log tail)
- `/tmp/arrflix-prod-vs-dev/{prod,dev}/play-t{5,10,20,30}.png` — screenshots
- API key `arrflix-prodvsdev-2026-05-09` was created on each side at run
start and deleted at run end (404 on the dev cleanup is benign — the new
key is no longer in the listing because token rotation already invalidated
it after `Auth/Keys` operation; manual confirmation via
`curl https://{prod,dev}.../Auth/Keys` shows no leftover entry).
Note that even in the headless-chromium harness, prod's underlying `<video>`
element was still **painting actual pixels** (paintLuma ~107). In a real
browser the same overlay div fully covers those frames, so the user reports
"black screen" exactly as observed in the screenshots.
---
## INC7 final — CSS overlay was the actual cause
After INC7-attempt-1 (Traefik SW-pin fix) shipped, headless playwright
on prod still measured **`darkPct=100%`** of the visual viewport while
the `<video>` element decoded frames (canvas `drawImage` luma=84,
`videoWidth=1920`, `currentTime` advancing). Confirmed agent 2's
hypothesis: `<video>` paints, but a CSS overlay covers it.
### Root cause
`branding.xml` BLACK-PASS rule paints `.libraryPage` with
`background:#000 !important`. Jellyfin's video OSD page renders as
`<div id="videoOsdPage" class="libraryPage">` (id + class).
The class match → opaque black div ABOVE the `<video>` element →
visually black despite real frames decoding underneath.
Dev didn't ship the BLACK-PASS block at all → no overlay → video
visible.
### Fix (CSS, server-side branding.xml CustomCss)
```css
.libraryPage:has(.htmlVideoPlayer),
.libraryPage#videoOsdPage,
#videoOsdPage,
#videoOsdPage .pageContainer,
#videoOsdPage .layout-desktop,
#videoOsdPage .mainAnimatedPages {
background-color: transparent !important;
background: transparent !important;
}
```
### Verified
Post-fix headless playwright: `darkPct=9.8%`. Screenshot `/tmp/inc7-after.png`
shows actual MNS S1E4 video frame (sasquatch in cage). Real visual paint.
### Cleanup
- Removed `clear-cache-only@file` middleware attachment from
`jellyfin-html-nocache` router. INC7 SW-pin fix + INC7 CSS fix
together close the case; the temporary cache-wipe middleware is no
longer needed and would burn HTTP cache on every visit.
- Backup: `/opt/docker/traefik/config/dynamic.yml.bak.inc6-removal.*`
### Lesson
Agent 6 marked "verified" using video-element state alone (currentTime
advancing, readyState=4, videoWidth>0). Element decoded fine — but
CSS overlay above it made it visually black. Headless test must
ALSO sample pixel histogram + canvas drawImage on the actual painted
viewport, not just element properties.
`bin/headless-test-v2.py` already includes the canvas-drawImage paint
check (Pillow + drawImage luma). Add a `darkPct` assertion to surface
this class of regression next time.
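A sketch of that assertion, assuming a Playwright `page` and Pillow (already a dependency of the paint check); the threshold is illustrative:
```python
from io import BytesIO
from PIL import Image

def dark_pct(png_bytes: bytes, threshold: int = 16) -> float:
    """Percent of viewport pixels whose luma falls below `threshold` (0-255)."""
    img = Image.open(BytesIO(png_bytes)).convert("L")   # "L" = 8-bit luma
    hist = img.histogram()                              # 256 buckets for L mode
    return 100.0 * sum(hist[:threshold]) / (img.width * img.height)

shot = page.screenshot()                                # full visual viewport
assert dark_pct(shot) < 50.0, "viewport mostly black; overlay regression?"
```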
### Status
INC7 FINAL — case closed. Owner action: hard-reload browser,
confirm visual paint.

View file

@ -1,174 +0,0 @@
# 29 — Middle-Theme v6 + Prod Stream Restore (2026-05-09)
> Outcome: ARRFLIX wordmark logo dead-center, Movies/Series nav left, search right; auth-gated so login page is untouched; header hidden during video playback. Same patch shipped to prod simultaneously with the **branding.xml `<video>` XML escape** that restored the INC7 transparent-video CSS — closing the live black-screen issue users saw on prod.
Status: **DEPLOYED 2026-05-09** — dev (`dev.arrflix.s8n.ru`) and prod (`arrflix.s8n.ru`) both serve `web-overrides/index.html` md5 `c6c85076951633c434864a0133d602e5`. Prod `/Branding/Css.css` went 0 B → 36 256 B post-fix.
Sibling docs: 28 (prod-vs-dev playback divergence — INC7 streaming fix), 26 (incident chain INC1–INC5), 12 (dev mirror), 17 (dev mirror + settings fix).
---
## What v6 ships
1. **ARRFLIX wordmark, dead-center** in `.skinHeader .headerTop`. `.arrflix-headerLogo` is an `<a href="#/home.html">` with `position:absolute; left:50%; transform:translate(-50%,-50%)`. Background-image inlined as base64 (the same wordmark already used by `.adminDrawerLogo img` and `.pageTitleWithLogo` in `branding.xml`'s `CustomCss`). Width 120, height 38, aspect 235:85.
2. **Movies + Series uppercase nav links** injected into `.headerLeft`. `<a is="emby-linkbutton" class="emby-button arrflix-nav" href="#/movies.html">Movies</a>` (and `#/tv.html` for Series). The link `href` is bare — no `topParentId` query — so Jellyfin's `MoviesPage` resolves the library via user policy.
3. **Search button on the right** — Jellyfin's stock `.headerSearchButton` left untouched. `.headerLeft, .headerRight { flex:1 1 0 }` + `.headerRight { justify-content: flex-end }` push it to the corner.
4. **Stock clutter hidden** under `body.arrflix-themed`: `.headerHomeButton`, `.pageTitleWithLogo`, `.headerCastButton`, `.headerSyncButton`, `.headerTabs.sectionTabs`, and the bare `h3.pageTitle:not(.pageTitleWithLogo)` (the duplicate "Movies" title that appeared on library pages).
5. **Favicon swap** to the ARRFLIX "A" mark — injected as `<link rel="icon" type="image/png" href="data:image/png;base64,…">` plus `apple-touch-icon`, both wrapped in `<!--ARRFLIX-FAVICON-BEGIN/END-->` markers for idempotent re-runs.
6. **Auth gate.** `body.arrflix-themed` is added by JS only when `ApiClient.isLoggedIn()` AND `localStorage.jellyfin_credentials` has an `AccessToken` AND the current `location.hash` is not on `/login|/wizard|/forgotpassword|/selectserver`. CSS rules are scoped to `body.arrflix-themed` so the login page renders stock-with-Cineplex (ARRFLIX top-left red, Manual Login form) — not the rearranged middle-theme.
7. **Video page suppression.** When `location.hash` includes `/video` OR `#videoOsdPage:not(.hide)` is in the DOM OR a visible `.htmlVideoPlayer` exists, JS adds `body.arrflix-video-active`. CSS rule `body.arrflix-video-active:not(:has(#loginPage:not(.hide))) .skinHeader, body.arrflix-video-active .arrflix-headerLogo, body.arrflix-video-active .arrflix-nav { display:none !important }` — specificity (0,4,2) beats Cineplex's `body:not(:has(#loginPage:not(.hide))) .skinHeader { display:flex !important }` (0,3,2), so our hide wins.
JS uses `MutationObserver` on `body` + `hashchange` listener + `setInterval(1500)` watchdog. Idempotent: re-entry checks via `[data-arrflix-nav="movies"]` selector.
---
## Build
The patch is a single Python script: `bin/inject-middle-theme.py`. It:
1. Reads the target HTML (default `/opt/docker/jellyfin/web-overrides/index.html` — overridable via env var `ARRFLIX_OVERLAY_PATH`).
2. Strips any prior `<style>ARRFLIX-MIDDLE-THEME-BEGIN…END</style>`, `<script>…BEGIN…END</script>`, and `<!--ARRFLIX-FAVICON-BEGIN…END-->` blocks (idempotent — safe to re-run).
3. Reads two artifacts:
- `web-overrides/assets/arrflix-A.png` (encoded inline as base64 for favicon)
- The wordmark base64 embedded in `branding.xml` (extracted at build time)
4. Inlines a `<style>` block, a `<script>` block, and two `<link>` tags into `<head>` immediately before `</head>`.
5. Writes a backup at `<target>.bak.pre-middle-v6.<timestamp>` before overwriting.
Re-run safely — old marker blocks are stripped first; result is byte-deterministic (same inputs → same md5).
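The core strip-then-reinsert move, as a sketch (marker strings and helper names are illustrative; the real builder is `bin/inject-middle-theme.py`):
```python
import re
import time
from pathlib import Path

def strip_block(html: str, begin: str, end: str) -> str:
    """Drop any prior begin..end block, inclusive (non-greedy, dot matches newline)."""
    return re.sub(re.escape(begin) + r".*?" + re.escape(end), "", html, flags=re.S)

def inject(target: Path, payload: str, begin: str, end: str) -> None:
    original = target.read_bytes()                      # pre-edit bytes for the backup
    html = strip_block(original.decode("utf-8"), begin, end)
    html = html.replace("</head>", f"{begin}{payload}{end}</head>", 1)
    backup = target.with_name(f"{target.name}.bak.pre-middle-v6.{int(time.time())}")
    backup.write_bytes(original)                        # backup before overwrite
    target.write_text(html, encoding="utf-8")           # same inputs -> same md5
```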
---
## Stream-restore side-fix
Prod's `branding.xml` had a `<video>` literal in a CSS comment (BLACK-PASS section explaining INC7's transparent-video rule). The XML parser choked on the unescaped `<` → Jellyfin silently dropped the entire `<CustomCss>` block → the INC7 transparent-video rule never reached the browser → `#videoOsdPage` rendered an opaque black `.libraryPage` background OVER the decoded `<video>` frames → users saw black screens during playback.
`/Branding/Css.css` returned **0 bytes** until this was fixed (and **36 256 bytes** after).
Fix: escape the two unescaped `<video>` tokens to `&lt;video&gt;`. Before:
```
on top of <video> as opaque black -> visually black despite <video>
```
After:
```
on top of &lt;video&gt; as opaque black -> visually black despite &lt;video&gt;
```
XML now passes `xmllint --noout` cleanly. Same fix applied to dev simultaneously — both branding.xml files now have md5 `<see config>` and parse identically.
This single character-level escape is what restored streaming on prod. The doc-28 chain (Traefik SW pin, INC7 transparent CSS) was technically correct upstream — the diagnosis was right, but the *delivery* was broken because the XML never loaded. INC7's CSS rule had been "in" `branding.xml` since 2026-05-09 02:46Z, but `Branding/Css.css` was empty so the rule never reached any browser.
**Lesson:** add `xmllint --noout branding.xml` to deploy CI. The user-visible failure mode of a malformed `BrandingOptions` XML is silent (zero-byte response, no banner, no admin notification), and both prod and dev had been running unthemed-via-CustomCss for multiple deploy cycles before anyone noticed.
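A stdlib-only sketch of that gate, equivalent in spirit to `xmllint --noout` plus an empty-CustomCss check (element name per the `<CustomCss>` block described above):
```python
import sys
import xml.etree.ElementTree as ET

try:
    tree = ET.parse("branding.xml")   # raises ParseError on any unescaped '<'
except ET.ParseError as e:
    sys.exit(f"branding.xml is not well-formed XML: {e}")

css = tree.getroot().findtext("CustomCss") or ""
if not css.strip():
    sys.exit("branding.xml parsed, but CustomCss is empty; theme would silently not ship")
```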
---
## Files touched
| Path | Change |
|------|--------|
| `web-overrides/index.html` | Apply `bin/inject-middle-theme.py` — adds 75 KB (wordmark + favicon base64 + style + script + link). Idempotent markers `ARRFLIX-MIDDLE-THEME-BEGIN/END` and `ARRFLIX-FAVICON-BEGIN/END`. md5 `c6c85076951633c434864a0133d602e5`. |
| `web-overrides/assets/arrflix-A.png` | New — 1695×928 PNG of the ARRFLIX "A" mark on white. Source for the favicon (white→transparent + resize to 138×180 → base64 inline). |
| `bin/inject-middle-theme.py` | New — the patch builder. |
| `docs/29-middle-theme-v6-2026-05-09.md` | This doc. |
| **Server-side** `/home/docker/jellyfin/config/config/branding.xml` (prod) | Two `<video>` → `&lt;video&gt;` escapes. **Not in repo** (config is per-deployment; document the change here). |
| **Server-side** `/home/docker/jellyfin-dev/config/config/branding.xml` (dev) | Same escape. |
---
## Deploy procedure
### Dev
```bash
# Re-run patch builder against dev's overlay (idempotent)
python3 bin/inject-middle-theme.py
scp web-overrides/index.html user@192.168.0.100:/opt/docker/jellyfin-dev/web-overrides/index-dev.html
# Single-file bind mount — no container restart needed
```
### Prod
Prod's overlay file is owned `root:root`, so `ssh user@…` can't write directly. Use a docker-as-root shim:
```bash
docker run --rm --userns=host \
-v /opt/docker/jellyfin/web-overrides:/d:rw \
-v /tmp:/tmp:rw \
alpine sh -c '
apk add --no-cache python3 >/dev/null 2>&1 &&
python3 /tmp/inject-middle-theme.py /d/index.html
'
docker run --rm --userns=host -v /opt/docker/jellyfin/web-overrides:/d:rw \
alpine chown root:root /d/index.html
```
If `branding.xml` was rewritten with new content, also escape any new `<video>` (or any other unescaped `<`) and run `xmllint --noout` before restarting. Then:
```bash
docker restart jellyfin
# 30s downtime; users will need to refresh
```
### Verify
```bash
docker exec jellyfin curl -s http://127.0.0.1:8096/Branding/Css.css | wc -c # expect ~36 KB
docker exec jellyfin curl -s http://127.0.0.1:8096/web/index.html | grep -c ARRFLIX-MIDDLE-THEME-BEGIN # expect 2
```
Headless visual: run `bin/headless-test-v2.py` against prod with a known user — `darkPct` on the OSD frame should drop from ~100 % (pre-fix) to <10 % (post-fix), per the doc-28 INC7-final lesson.
---
## Account state on dev
Dev jellyfin instance currently hosts a **single account** for theme testing:
| User | Password | Admin | Hidden |
|------|----------|-------|--------|
| `test` | `123` | yes | no |
The 7 mirror accounts (`marco-mirror`, `house-mirror`, `guest-mirror`, `aloy-mirror`, `pet-mirror`, `5-mirror`, `s8n-dev`) were deleted earlier in the session per owner's "replace all" decision. Library content (Movies + TV Shows) was inherited from prod via a one-time `/config` rsync (excluded `data/jellyfin.db`) so dev sees the same titles and metadata as prod.
**Recovery quirk:** `test`'s password gets nuked occasionally after `docker cp jellyfin.db` operations because `userns_mode: host` flips ownership back to host uid 101000 (the userns-remap of container 1000). Recovery cycle:
```bash
docker stop jellyfin-dev
docker cp jellyfin-dev:/config/data/jellyfin.db /tmp/r.db
docker cp jellyfin-dev:/config/data/jellyfin.db-wal /tmp/r.db-wal 2>/dev/null
sqlite3 /tmp/r.db 'PRAGMA wal_checkpoint(TRUNCATE); UPDATE Users SET Password=NULL, InvalidLoginAttemptCount=0 WHERE Username="test";'
docker cp /tmp/r.db jellyfin-dev:/config/data/jellyfin.db
docker exec --user 0 jellyfin-dev sh -c 'rm -f /config/data/jellyfin.db-wal /config/data/jellyfin.db-shm; chown 1000:1000 /config/data/jellyfin.db'
docker restart jellyfin-dev && sleep 9
# Authenticate with blank password, then POST /Users/{id}/Password { "CurrentPw":"", "NewPw":"123" }
```
User ID for `test`: `a0ea2751d4e2467cb634485614a959e8`.
---
## Open follow-ups
| Item | Where |
|------|-------|
| `compose-dev/docker-compose.yml` in repo lacks the overlay bind-mount that the live host has | `compose-dev/docker-compose.yml` |
| Dev's `system.xml` has `QuickConnectAvailable=true`, prod has `false` — Quick Connect button visible on dev login only | `system.xml` line ~7 |
| Locale-en-only chunk JS files (`*-json.*.chunk.js`) bind-mounted on prod (94 of them) but absent on dev → dev users get stock locale strings | host `/opt/docker/jellyfin/web-overrides/locale-en-only/` |
| Movies/Shows pages on dev show a stuck spinner because Jellyfin's `tryRestoreView` bounces a cached `?topParentId=movies` URL → `/Items/movies` 400. Not a v6 regression — present in stock build too. | Jellyfin `viewContainer.tryRestoreView` |
| Add `xmllint --noout branding.xml` to repo CI | new |
| Headless `darkPct` assertion to surface CSS-overlay-over-video regressions automatically | `bin/headless-test-v2.py` |
---
## Snapshot
| Asset | md5 |
|-------|-----|
| `web-overrides/index.html` (post-v6) | `c6c85076951633c434864a0133d602e5` |
| `branding.xml` (prod, post-escape) | (see live config) |
| `branding.xml` (dev, post-escape) | (see live config) |
| `arrflix-A.png` (asset source) | (see repo) |
Both deploy targets running `c6c85076951633c434864a0133d602e5` as of 2026-05-09 ~03:00 UTC.

View file

@ -1,147 +0,0 @@
# 2026-05-08 — YouTube import: Sassy the Sasquatch (2022)
## Source
| Field | Value |
|---|---|
| Upstream platform | YouTube |
| Channel name | THE BIG LEZ SHOW OFFICIAL |
| Channel id | `UCV1G6JkQtB2nobFm3MGNsBQ` |
| Playlist title | `SASSY THE SASQUATCH` |
| Playlist id | `PLGMC7oz7XpmDMGrALMQiNXCi9p7aqkWbj` |
| Playlist URL | `youtube.com/playlist?list=PLGMC7oz7XpmDMGrALMQiNXCi9p7aqkWbj` (no clickable link by policy) |
## Date imported
`2026-05-08`
## Episodes imported
| # | YouTube id | YouTube title | Canonical filename | Jellyfin episode id | Resolution | Size (bytes) |
|---|---|---|---|---|---|---|
| 1 | `9OmR0ypCyOU` | `SASSY THE SASQUATCH \| EP01 \| SEEN A DINOSAUR` | `Sassy the Sasquatch (2022)/Season 01/Sassy the Sasquatch (2022) - S01E01 - Seen a Dinosaur.mkv` | `2ef02448506b543f39b9372c2b0cdef2` | 1920x1080 | 36,434,608 |
| 2 | `tvCUmH92HfU` | `SASSY THE SASQUATCH \| EP02 \| WATER YOU TALKINABEET` | `Sassy the Sasquatch (2022)/Season 01/Sassy the Sasquatch (2022) - S01E02 - Water You Talkinabeet.mkv` | `3b7809d6840e5ee230fbba951fee227e` | 1920x1080 | 28,928,772 |
| 3 | `QvIgmc2G6lk` | `SASSY THE SASQUATCH \| EP03 \| WALKABEET` | `Sassy the Sasquatch (2022)/Season 01/Sassy the Sasquatch (2022) - S01E03 - Walkabeet.mkv` | `4a0188c67c1d6d08000997177143d6f2` | 1920x1080 | 22,663,417 |
| 4 | `RU9zuIqPcJw` | `SASSY THE SASQUATCH \| EP04 \| AREA 51` | `Sassy the Sasquatch (2022)/Season 01/Sassy the Sasquatch (2022) - S01E04 - Area 51.mkv` | `3dc35c7341f4476e6218840dceb63163` | 1920x1080 | 30,784,810 |
| 5 | `vUyJq1kd-bc` | `SASSY THE SASQUATCH \| EP05 \| SNOW WORRIES` | `Sassy the Sasquatch (2022)/Season 01/Sassy the Sasquatch (2022) - S01E05 - Snow Worries.mkv` | `f79032d3bfbed6371e11375bdfc1b8a6` | 1920x1080 | 36,913,187 |
| 6 | `bi_HbwZDdPg` | `SASSY THE SASQUATCH \| EP06 \| AS ABOING SO BADOING` | _(not imported)_ | _(n/a)_ | _(n/a)_ | 0 |
**Imported: 5 / 6.** EP06 is age-restricted on YouTube and the sibling
downloader had no authenticated cookie store — see Known caveats.
## Naming rules applied
Cited section numbers refer to files under `docs/`:
- **`docs/05` §0 rule 5 — mandatory `(Year)`.** Year `2022` (from manifest)
appended to the show folder and every episode basename, even though the
show name is unique in the library.
- **`docs/05` §2 + `docs/08` §1.2 — TV canonical layout.** Final layout is
`Show (Year)/Season NN/Show (Year) - SxxEyy - Title.ext`. Applied
literally across all 5 episodes.
- **`docs/05` §2.1 + `docs/08` §4.2 — zero-padded `Season 01`.** All five
files placed under `Season 01/` (single-playlist → single-season default).
- **`docs/08` §1.2 — zero-padded two-digit episode marker.** `S01E01`
through `S01E05`, never `S1E1` / `1x01` / `Ep1`.
- **`docs/08` §2.1 + §3.4 — strip channel-pollution prefix and
bracket/id tags.** The repeated `SASSY_THE_SASQUATCH_`, the `EPxx_`
fragment, and the `[<videoId>]` suffix in raw filenames were all
removed; only the human-readable episode title survives in the
canonical name.
- **`docs/08` §2.5 — underscores → spaces.** `WATER_YOU_TALKINABEET` →
  `Water You Talkinabeet`; `AREA_51` → `Area 51`; `SEEN_A_DINOSAUR` →
  `Seen a Dinosaur`; `SNOW_WORRIES` → `Snow Worries`; `WALKABEET` left
  as a single word.
- **`docs/08` §5.1 — smart title case.** All-uppercase YouTube titles
recased; small word `a` lowercased mid-title (`Seen a Dinosaur`); the
number `51` preserved as-is (`Area 51`).
- **`docs/05` §0 rule 3 / `docs/08` §5.5 — ASCII-only, no forbidden
characters.** Source titles contained no `< > : " / \ | ? *`, so no
substitutions were needed; verified post-rename.
- **`docs/08` §1.1 / §1.2 forbidden list — no resolution/codec/group
tags.** `1080p`, `av01`, `opus`, `WEB-DL`, etc. all stripped; canonical
names contain title only.
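Replayed end-to-end, the rules compose like this (a sketch: the raw filename is a reconstruction of the pattern described in §2.1/§3.4, not a verbatim download name):
```bash
# Illustrative replay of the naming rules above on one raw download.
raw='SASSY_THE_SASQUATCH_EP02_WATER_YOU_TALKINABEET [tvCUmH92HfU].mkv'
ep=$(printf '%s' "$raw" | sed -E 's/^.*_EP([0-9]{2})_.*$/\1/')   # §1.2 zero-padded marker
title=$(printf '%s' "$raw" \
  | sed -E 's/^SASSY_THE_SASQUATCH_EP[0-9]{2}_//; s/ \[[^]]+\]\.mkv$//' \
  | tr '_' ' ' \
  | awk '{ for (i=1; i<=NF; i++) { w = tolower($i)
           if (i>1 && (w=="a"||w=="an"||w=="the"||w=="of")) $i = w   # §5.1 small words
           else $i = toupper(substr(w,1,1)) substr(w,2) }
           print }')
echo "Sassy the Sasquatch (2022) - S01E${ep} - ${title}.mkv"
# → Sassy the Sasquatch (2022) - S01E02 - Water You Talkinabeet.mkv
```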
## rsync stats
```
sent 155,763,373 bytes
received 126 bytes
throughput 44,503,856 bytes/sec (~42.4 MiB/s)
total size 155,724,794 bytes
speedup 1.00 (initial transfer, no rolling-checksum reuse)
flags -av --partial --append-verify
files 5 transferred + 2 dirs created
```
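For the audit trail, the transfer reconstructs to something like the following (a sketch, not the verbatim command; staging path from caveat 4, target host per the deploy docs):
```bash
# Reconstructed push from the onyx staging tree into the live TV library.
# Paths and host are inferred from this log, not copied from shell history.
rsync -av --partial --append-verify \
  "/home/admin/yt-norm-staged/Sassy the Sasquatch (2022)" \
  user@192.168.0.100:/home/user/media/tv/
# matches the stats above: 5 files transferred, 2 dirs created
```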
## Jellyfin verification
| Field | Value |
|---|---|
| Series id | `b2d1afd8a4a30c59adb42ccaf47376c2` |
| Series name | `Sassy the Sasquatch` |
| ProductionYear | `2022` |
| Path | `/media/tv/Sassy the Sasquatch (2022)` |
| Provider ids | `Tmdb=321760 Imdb=tt21209936 Tvdb=421839` |
| Episode count | 5 |
| Direct-play (E01 sample) | `SupportsDirectPlay=True`, `SupportsDirectStream=True`, `SupportsTranscoding=False`, video `av1` profile Main, audio `opus` |
| Library scan | `RefreshLibrary` reached `Idle / Completed` < 10 s after `POST /Library/Refresh`; full series refresh queued via `POST /Items/.../Refresh?Recursive=true&MetadataRefreshMode=FullRefresh&ImageRefreshMode=FullRefresh` (HTTP 204) |
All 5 episodes mapped to correct `SxxEyy` via filename → Jellyfin parser
(no manual identify required).
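The two refresh calls cited above, as curl sketches (the `X-Emby-Token` header is one standard way to pass the Jellyfin API token; never commit the token itself):
```bash
TOKEN="<JELLYFIN_API_TOKEN>"   # placeholder, per snapshot-doc convention
BASE="https://arrflix.s8n.ru"
# Library-wide scan; poll the RefreshLibrary task until Idle/Completed.
curl -ks -X POST -H "X-Emby-Token: $TOKEN" "$BASE/Library/Refresh"
# Full series refresh (expect HTTP 204 printed).
curl -ks -o /dev/null -w '%{http_code}\n' -X POST -H "X-Emby-Token: $TOKEN" \
  "$BASE/Items/b2d1afd8a4a30c59adb42ccaf47376c2/Refresh?Recursive=true&MetadataRefreshMode=FullRefresh&ImageRefreshMode=FullRefresh"
```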
## Known caveats
1. **EP06 age-restricted, not imported.** YouTube requires authenticated
cookies for `bi_HbwZDdPg` (`AS ABOING SO BADOING`); the sibling downloader
ran without a logged-in browser session. Library currently shows
5/6 episodes. Operator's call whether to retry with cookies (see the
retry sketch after this list) or accept the gap.
2. **Jellyfin display titles uppercase.** TVDB has the official episode
titles entered in all-caps (`SEEN A DINOSAUR`, etc.) and Jellyfin
prefers provider data over filename-derived titles. Filenames on disk
remain canonical-cased per `docs/08` §5.1; only the API/UI display
layer follows TVDB. Not a bug.
3. **No subtitles.** Source has YouTube auto-generated captions only;
sibling did not pull them, so no `.srt` / `.ass` siblings exist in
the library.
4. **Cross-device hard-link failure during staging.** Sibling's first
staging attempt at `/tmp/yt-norm/staged/` failed (`/tmp` is `tmpfs`,
source is on LUKS+ext4); re-staged at `/home/admin/yt-norm-staged/`
on the same FS. Net disk impact: zero (inode-only links). Cosmetic.
5. **Brief said "6 episodes," manifest had 6 with 1 failed.** Reconciled
to actual 5 successful downloads; counts in this log reflect reality,
not the brief.
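If the operator chooses the retry, a minimal sketch (assuming the sibling's downloader is yt-dlp and a logged-in Firefox profile exists on the staging host):
```bash
# Retry the age-restricted EP06 with authenticated browser cookies.
# Bare videoId only, per the IMPORT-LOG boundaries (no full URLs).
yt-dlp --cookies-from-browser firefox \
  -o 'raw/%(title)s [%(id)s].%(ext)s' \
  -- bi_HbwZDdPg
```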
## Source manifest copied
Top-level fields of `manifest.json` (sibling agent's output, originally at
`/home/admin/yt-import-staging/manifest.json` on onyx):
```json
{
"show_name": "Sassy the Sasquatch",
"year": "2022",
"channel": "THE BIG LEZ SHOW OFFICIAL",
"channel_id": "UCV1G6JkQtB2nobFm3MGNsBQ",
"playlist_id": "PLGMC7oz7XpmDMGrALMQiNXCi9p7aqkWbj",
"playlist_title": "SASSY THE SASQUATCH",
"raw_dir": "/home/admin/yt-import-staging/raw",
"downloaded_count": 5,
"failed_count": 1,
"total_bytes": 155724794
}
```
The full per-episode `episodes[]` array is **not** embedded here. Trace
each episode by canonical path under the live tree on nullstone:
```
/home/user/media/tv/Sassy the Sasquatch (2022)/
└── Season 01/
├── Sassy the Sasquatch (2022) - S01E01 - Seen a Dinosaur.mkv
├── Sassy the Sasquatch (2022) - S01E02 - Water You Talkinabeet.mkv
├── Sassy the Sasquatch (2022) - S01E03 - Walkabeet.mkv
├── Sassy the Sasquatch (2022) - S01E04 - Area 51.mkv
└── Sassy the Sasquatch (2022) - S01E05 - Snow Worries.mkv
```


@ -1,42 +0,0 @@
# IMPORT-LOG
Append-only ledger of media-import events for ARRFLIX. Each file in this
directory documents a single import operation so future audits can trace
content on disk back to its provenance.
## Convention
- **One file per import event.**
- **Filename format:** `YYYY-MM-DD-<source>-<show-slug>.md`
- `<source>` is the upstream type (`youtube`, `tvdb-rip`, `bluray`, `manual`, etc.).
- `<show-slug>` is the lowercase show name with `-` separators.
- Example: `2026-05-08-youtube-sassy-the-sasquatch.md` (see the sketch after this list).
- **Date** is the day the import landed on the live library (not the day the
source was published).
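Minting a new entry name, as a sketch (run from inside this directory; the variable names are illustrative):
```bash
# Compose a log filename per the convention above.
d=$(date -I)                  # day the import lands on the live library
src=youtube                   # upstream type
slug=sassy-the-sasquatch      # lowercase show name, '-' separators
touch "${d}-${src}-${slug}.md"
```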
## Required sections
Each entry should include:
1. **Source** — upstream URL/handle, channel/distributor, identifiers.
2. **Date imported** — ISO-8601.
3. **Episodes imported** — table mapping source identifier → canonical
filename → Jellyfin item id (resolution + size where applicable).
4. **Naming rules applied** — citations of the `docs/0X.md` rule numbers
invoked, with a short note on the specific edit each made.
5. **rsync stats** — `sent / received / speedup` from the transfer.
6. **Jellyfin verification** — series id, episode count, direct-play status.
7. **Known caveats** — anything that wasn't perfect (missing metadata,
failed thumbnails, age-restricted episodes, etc.).
8. **Source manifest copied** — top-level fields of the operator's
`manifest.json`. Do **not** embed the full `episodes[]` array — link to
the canonical files in the live library instead.
## Boundaries
- **No access tokens, API keys, or session cookies** in any log file.
- For YouTube imports, **do not include full video URLs** — record the bare
`videoId` only, so import logs are not indexable as a re-distribution
pointer.
- Logs are **immutable once committed**. To correct a mistake, append a
follow-up entry rather than rewriting history.


@ -33,7 +33,7 @@ User IDs captured:
```diff
 SNAP=/tmp/ARRFLIX/snapshots/2026-05-08-pre-elegantfin
-TOKEN="<JELLYFIN_API_TOKEN>"
+TOKEN="*redacted*"
 BASE="https://arrflix.s8n.ru"
```
@ -76,7 +76,7 @@ done
```diff
 SNAP=/tmp/ARRFLIX/snapshots/2026-05-08-pre-elegantfin
-TOKEN="<JELLYFIN_API_TOKEN>"
+TOKEN="*redacted*"
 BASE="https://arrflix.s8n.ru"
 # 1. branding
```

@ -1,137 +0,0 @@
# English Lockdown — Web-side Shim
> Browser-side belt for the per-user `UICulture` pin documented in
> `docs/15-force-english.md`. The server-side POST sets the authoritative
> value; this shim removes every escape hatch the SPA exposes to the user
> so they can't unpin it from the browser.
Last verified: 2026-05-08 against Jellyfin 10.10.3 web bundle, arrflix.s8n.ru.
---
## What this does
The English-lockdown logic lives **inside the existing ARRFLIX runtime
shim** (one self-contained IIFE, per `docs/10-spa-runtime-shim.md`:
"the shim must remain self-contained"). Source of truth:
`bin/inject-shim.py`. Compiled output lands in `web-overrides/index.html`
between the `ARRFLIX-SHIM-BEGIN` / `-END` markers.
It runs **synchronously, before the Jellyfin bundle parses**, and pins
the UI to English by:
| # | Mechanism | Why |
|---|-----------|-----|
| 1 | `localStorage.setItem` for `appLanguage`, `selectedlanguage`, `selectedlocale`, `language`, `locale`, `culture` → all set to `en-US` | Belt-and-braces; covers every key Jellyfin web has shipped under across versions. |
| 2 | `Object.defineProperty(Navigator.prototype, 'language' / 'languages')` returning `'en-US'` / `['en-US','en']` | Jellyfin's pre-auth bundle reads `navigator.language` to pick the splash translation file. Overriding the prototype getter beats the bundle to it. |
| 3 | `fetch` wrapper — strips `Accept-Language` header on outbound requests; rewrites `POST /Users/{id}/Configuration` body to force `UICulture: 'en-US'` before send | Defensive: even if a future Jellyfin build offers a "save language" UI we don't know about, the POST gets rewritten in-flight. The user can't opt out. |
| 4 | `XMLHttpRequest` wrapper — same Accept-Language strip + same Configuration POST rewrite | Older Jellyfin bundle code paths use XHR rather than fetch. Belt for the fetch suspenders. |
| 5 | `pinLocale()` re-runs on every `start()` call AND on the existing 1s setInterval safety net | Re-pins the storage keys if the SPA tries to clear or rewrite them. |
A companion CSS block in the existing critical-path `<style>` tag (top of
`web-overrides/index.html`) hides every language-switcher widget in the
UI: profile prefs dropdown, login page locale picker, header userMenu
locale flag, and (via `:has()`) any future `<select>` that contains a
`de-DE`/`fr-FR`/`es-ES` option.
## Where it gets injected from
```
bin/inject-shim.py # source of truth for the JS shim (run after edits)
web-overrides/index.html # the IIFE lives here, between ARRFLIX-SHIM-BEGIN/-END
# the CSS hide rules live in the <style> at the top
```
Container bind-mount (compose, unchanged from `docs/10`):
```yaml
volumes:
- /opt/docker/jellyfin/web-overrides/index.html:/jellyfin/jellyfin-web/index.html:ro
```
## Deploy workflow
```bash
# 1. Edit bin/inject-shim.py (NOT the IIFE inside index.html directly)
# 2. Re-run injector locally
python3 bin/inject-shim.py
# 3. scp to nullstone
scp web-overrides/index.html user@192.168.0.100:/opt/docker/jellyfin/web-overrides/index.html
# 4. Hard-refresh a browser. No container restart needed (single-file bind mount).
```
CSS-only edits (the `<style>` block at the very top of `index.html`)
are edited directly — the injector only owns the `<script>` IIFE.
## Verification (operator runs after deploy)
In a fresh incognito browser at `https://arrflix.s8n.ru`, open DevTools
console and run:
```js
localStorage.getItem('appLanguage') // expect: "en-US"
localStorage.getItem('selectedlanguage') // expect: "en-US"
navigator.language // expect: "en-US"
navigator.languages // expect: ["en-US", "en"]
```
Curl-side, confirm the served file carries the new shim:
```bash
curl -ks https://arrflix.s8n.ru/web/index.html | grep -c english-lockdown
# expect: 2 (one in CSS comment header, one in JS comment header)
curl -ks https://arrflix.s8n.ru/web/index.html | grep -c pinLocale
# expect: 4 (definition + 3 call-sites: start, setInterval, comment)
```
UI-side:
- [ ] User profile prefs page shows no "Display Language" dropdown.
- [ ] Login page shows no language picker.
- [ ] Header userMenu shows no locale flag/text.
- [ ] After auth, every Play / Resume / Settings label is English even
from a browser sending `Accept-Language: de-DE,de;q=0.9`.
## Known limitations
1. **First-paint flash on cold load.** The pre-bundle splash strings
(login form: "Sign In" / "Username" / "Password") are loaded by the
bundle from a static JSON file based on the browser's locale-detection
fallback BEFORE the IIFE's `Object.defineProperty` can intercept.
Modern Chromium / Firefox respect the prototype redefinition fast
enough that this is sub-50ms in practice — but on a slow connection
you may briefly see German login labels before the English bundle
replaces them. Acceptable; matches the existing first-paint flash
caveat in `docs/10`.
2. **`Accept-Language` strip is best-effort.** Most browsers prevent JS
from removing or modifying the `Accept-Language` request header on
outbound `fetch` / XHR. The wrapper attempts the delete; if the
browser silently ignores it, no harm done — the per-user `UICulture`
pin (server-side, see `docs/15`) wins regardless.
3. **`Object.defineProperty` may fail on some embedded WebViews** that
freeze `Navigator.prototype`. The shim has a fallback that retries
on the `navigator` instance directly. If both fail, the navigator
getters still return browser values, but the localStorage pin and
the user-config-save rewrite still hold the line.
4. **CSS `:has()`** has the same Chromium 105+ / Firefox 121+ /
Safari 15.4+ floor as the existing drawer-Settings rules. On older
browsers the `option[value="de-DE"]`-conditional hide degrades
silently — the simpler `select[name="language"]` rules still hide
the standard dropdown.
## Why this is layered with the per-user `UICulture` POST
The server-side fix (`docs/15-force-english.md`) is the authoritative
mechanism: when a user has `Configuration.UICulture = "en-US"`, the SPA
honours it on every login. This shim exists because:
- New users created outside the wrapper might land without the pin.
- A future Jellyfin web release might add a "change language" affordance
inside the player or a settings deeplink we haven't audited.
- The pre-auth splash bundle ignores `UICulture` (the user isn't logged
in yet) and reads `navigator.language` directly.
CSS hide + JS lockdown belt; per-user POST suspenders. Both are needed.
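An operator-side spot-check that the authoritative pin holds for a given user (a sketch: substitute a real user id for the placeholder, and `jq` is assumed to be installed):
```bash
TOKEN="<JELLYFIN_API_TOKEN>"   # placeholder, never commit the real token
# Read one user's server-side configuration and confirm the pin.
curl -ks -H "X-Emby-Token: $TOKEN" \
  "https://arrflix.s8n.ru/Users/<userId>" | jq -r '.Configuration.UICulture'
# expect: en-US
```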

(Remaining diffs suppressed by the renderer: three files with over-long lines, and one binary image removed, previously 517 KiB.)