Deezer cover art: upgrade CDN URL to 1900×1900 (was embedding 1000×1000)

Discord report (Tim): cover art downloaded via the Deezer metadata
source came out visibly blurry in Navidrome and on phones; large
displays exposed the limited resolution.

# Cause

Deezer's API returns `cover_xl` URLs at 1000×1000. The underlying
CDN actually serves up to 1900×1900 by rewriting the size segment
in the URL path (same trick the iTunes mzstatic + Spotify scdn
upgrades already use). SoulSync wasn't doing the rewrite — every
Deezer-sourced cover got embedded at 1000×1000 regardless of how
much higher resolution the CDN had available.
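The rewrite is a single substitution on that size segment. A minimal sketch of the trick (the example URL is illustrative, not a real cover):

```python
import re

# Deezer CDN paths embed the requested render size as "/WxH-", e.g.
# .../cover/<hash>/1000x1000-000000-80-0-0.jpg. Rewriting that one
# segment asks the CDN for a larger render of the same image.
url = 'https://cdn-images.dzcdn.net/images/cover/abc123/1000x1000-000000-80-0-0.jpg'
upgraded = re.sub(r'/(\d+)x(\d+)-', '/1900x1900-', url, count=1)
print(upgraded)
```

`count=1` keeps the substitution from touching any other digit-x-digit run that might appear later in the path.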

# Verified empirically

```
$ for size in 1000 1400 1800 1900 2000; do curl -I "...{size}x{size}-..."; done
1000: 200 OK  106 KB
1400: 200 OK  198 KB
1800: 200 OK  331 KB
1900: 200 OK  371 KB
2000: 403 Forbidden
```

1900 is the safe ceiling; above that the CDN returns 403. The CDN
serves source-native bytes when the source is smaller than the target
(smaller-source albums return the same bytes whether you ask for 1000
or 1900), so requesting 1900 universally is safe.
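The probe is easy to reproduce from Python as well. A sketch using stdlib HEAD requests; the path hash below is a placeholder, so substitute a real `cover_xl` URL from the Deezer API before running:

```python
import urllib.error
import urllib.request

# Placeholder path hash: substitute a real cover URL taken from the
# Deezer API's cover_xl field before running.
TEMPLATE = 'https://cdn-images.dzcdn.net/images/cover/abc123/{s}x{s}-000000-80-0-0.jpg'


def probe(size: int) -> str:
    """HEAD-request one candidate size; report status and byte count."""
    req = urllib.request.Request(TEMPLATE.format(s=size), method='HEAD')
    try:
        with urllib.request.urlopen(req, timeout=15) as resp:
            length = resp.headers.get('Content-Length', '?')
            return f'{size}: {resp.status} OK  {length} bytes'
    except urllib.error.HTTPError as exc:
        return f'{size}: {exc.code} {exc.reason}'


if __name__ == '__main__':
    for size in (1000, 1400, 1800, 1900, 2000):
        print(probe(size))
```

With a real URL this should mirror the curl transcript above: 200s through 1900, 403 at 2000.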

# Fix

New `_upgrade_deezer_cover_url(url, target_size=1900)` helper in
`core/deezer_client.py`. Pure function, mirrors the
`_upgrade_spotify_image_url` pattern that already lives in
`core/spotify_client.py`. Defensive on every input shape:

- Empty / None → returned as-is
- Non-Deezer URL (no `dzcdn`) → returned as-is
- No size segment in URL → returned as-is
- Already at/above target → returned as-is (idempotent, never
  downgrades)
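Those guards reduce to three early returns ahead of the substitution. A condensed, self-contained sketch of the behaviour (re-implemented here for illustration; the real helper is `_upgrade_deezer_cover_url` in `core/deezer_client.py`):

```python
import re
from typing import Optional

_SIZE = re.compile(r'/(\d+)x(\d+)-')


def upgrade_cover_url(url: Optional[str], target: int = 1900) -> Optional[str]:
    if not url or 'dzcdn' not in url:   # empty/None, or not a Deezer CDN URL
        return url
    match = _SIZE.search(url)
    if match is None:                   # no size segment in the path
        return url
    if int(match.group(1)) >= target:   # at/above target: never downgrade
        return url
    return _SIZE.sub(f'/{target}x{target}-', url, count=1)


base = 'https://cdn-images.dzcdn.net/images/cover/abc/{}-000000-80-0-0.jpg'
assert upgrade_cover_url(None) is None
assert upgrade_cover_url('https://example.com/random.jpg') == 'https://example.com/random.jpg'
assert upgrade_cover_url(base.format('1000x1000')) == base.format('1900x1900')
assert upgrade_cover_url(base.format('3000x3000')) == base.format('3000x3000')
```

Because every guard returns the input unchanged, the function is safe to apply blindly at the download boundary without first checking which metadata source produced the URL.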

Applied at both cover-download sites:

- `core/metadata/artwork.py::download_cover_art` — auto post-process
  flow. Mirrors the existing iTunes mzstatic upgrade right above it.
- `core/tag_writer.py::download_cover_art` — enhanced library view's
  "Write Tags to File" feature.

# Scope discipline

- Helper applied at the DOWNLOAD boundary, not the source extraction
  point in `deezer_client.py`. This means cached entries in the
  metadata cache and DB row `image_url` columns keep the original
  1000×1000 URL Deezer's API returned; future CDN behavior changes
  only affect the download path, not stored data.
- Pre-existing `prefer_caa_art` toggle (Settings → Library →
  Post-Processing) untouched — orthogonal workaround for users who
  want even higher quality (MusicBrainz Cover Art Archive, often
  3000×3000+).
- iTunes / Spotify upgrade paths untouched — they already worked.

# Tests added (16)

`tests/metadata/test_deezer_cover_url_upgrade.py`:

- Standard upgrade: default target 1900 on cover URL, alternate
  dzcdn host (`e-cdns-images.dzcdn.net` vs `cdn-images.dzcdn.net`),
  artist picture URLs (same path pattern), 500×500 source upgrades
  too
- Custom target size: smaller target = no-op (never downgrade),
  larger target works
- Idempotent: already at/above target returned unchanged
- Defensive on non-Deezer URLs: parametrised across 5 hosts
  (Spotify scdn, iTunes mzstatic, MB CAA, Last.fm, random) — all
  returned untouched
- Defensive on malformed Deezer URL (no size segment) → returned
  as-is
- Empty / None handling

# Verification

- 16/16 helper tests pass
- 560/560 metadata + imports tests pass (no regression)
- 2559 full suite passes
- Ruff clean
Branch: pull/542/head
Author: Broque Thomas, 4 days ago
Parent: c6887809da
Commit: 80cf16339c
`core/deezer_client.py`:

```
@@ -45,6 +45,50 @@ def rate_limited(func):
    return wrapper


# Pattern matches Deezer's CDN cover/picture URL: a numeric width-x-height
# segment in the path (e.g. ``/1000x1000-000000-80-0-0.jpg``). Captures
# both halves so the replacement can use a single dimension and preserve
# the rest of the path verbatim.
_DEEZER_CDN_SIZE_PATTERN = re.compile(r'/(\d+)x(\d+)-')

# Maximum size Deezer's CDN serves before returning 403. Verified
# empirically against multiple albums: 1900 works reliably, 2000+
# returns Forbidden. The CDN serves the source-native size when it's
# smaller than requested, so asking for 1900 is safe even on albums
# whose source upload was lower-res (no upscaling, just the same bytes).
_DEEZER_MAX_COVER_SIZE = 1900


def _upgrade_deezer_cover_url(url: str, target_size: int = _DEEZER_MAX_COVER_SIZE) -> str:
    """Rewrite a Deezer CDN cover/picture URL to request a larger size.

    Deezer's API returns ``cover_xl`` / ``picture_xl`` URLs at
    1000×1000, but the underlying CDN serves up to 1900×1900 by
    rewriting the size segment in the URL path. This helper does the
    rewrite: same idea as ``_upgrade_spotify_image_url`` in
    ``spotify_client`` and the ``mzstatic.com`` size-replacement in
    ``download_cover_art``.

    Defensive on every input shape:

    - Empty / None URL: returned as-is
    - Non-Deezer URL (no ``dzcdn`` host) or no size segment: returned as-is
    - Already at or above target size: returned as-is (no point rewriting)

    The CDN returns the source-native image bytes when source < target,
    so asking for 1900 on an album whose source was uploaded at 600
    just returns the 600-pixel image: no upscaling, no failure.
    """
    if not url or 'dzcdn' not in url:
        return url
    match = _DEEZER_CDN_SIZE_PATTERN.search(url)
    if not match:
        return url
    current = int(match.group(1))
    if current >= target_size:
        return url
    return _DEEZER_CDN_SIZE_PATTERN.sub(f'/{target_size}x{target_size}-', url, count=1)


# ==================== Dataclasses (match iTunesClient / SpotifyClient format) ====================

@dataclass
```

`core/metadata/artwork.py`:

```
@@ -324,6 +324,20 @@ def download_cover_art(album_info: dict, target_dir: str, context: dict = None):
            logger.debug("upgrade spotify image url failed: %s", e)
    elif art_url and "mzstatic.com" in art_url:
        art_url = re.sub(r"\d+x\d+bb", "3000x3000bb", art_url)
    elif art_url and "dzcdn" in art_url:
        # Deezer's API returns cover_xl URLs at 1000×1000 but
        # the underlying CDN serves up to 1900×1900 by rewriting
        # the size segment in the URL path. Without this upgrade,
        # users embedding cover art via Deezer get visibly
        # blurry covers in their library / phone player (Discord
        # report from Tim, 2026-05). Same shape as the iTunes
        # mzstatic upgrade above + Spotify scdn upgrade.
        try:
            from core.deezer_client import _upgrade_deezer_cover_url
            art_url = _upgrade_deezer_cover_url(art_url)
        except Exception as e:
            logger.debug("upgrade deezer image url failed: %s", e)
    if not art_url:
        logger.warning("No cover art URL available for download.")
        return
```

`core/tag_writer.py`:

```
@@ -204,9 +204,21 @@ def download_cover_art(cover_url: str) -> Optional[Tuple[bytes, str]]:
    """
    Download cover art once. Returns (image_data, mime_type) or None on failure.
    Call this once per album, then pass the result to write_tags_to_file for each track.

    For Deezer CDN URLs, upgrades the size segment to 1900×1900 (CDN
    max). Mirrors the same upgrade in
    ``core.metadata.artwork.download_cover_art`` so the
    enhanced-library-view "Write Tags to File" feature embeds the same
    high-resolution cover the auto post-process flow does.
    """
    if not cover_url:
        return None
    if 'dzcdn' in cover_url:
        try:
            from core.deezer_client import _upgrade_deezer_cover_url
            cover_url = _upgrade_deezer_cover_url(cover_url)
        except Exception as e:
            logger.debug("upgrade deezer image url failed: %s", e)
    try:
        with urllib.request.urlopen(cover_url, timeout=15) as response:
            image_data = response.read()
```

`tests/metadata/test_deezer_cover_url_upgrade.py` (new file):

```
@@ -0,0 +1,140 @@
"""Pin the Deezer CDN cover-URL upgrade helper.

Discord report (Tim, 2026-05-XX): downloaded cover art via the Deezer
metadata source comes out blurry (visibly low-res) in Navidrome.
Cause: Deezer's API returns ``cover_xl`` URLs at 1000×1000 but the
underlying CDN serves up to 1900×1900 by rewriting the size segment
in the URL path. SoulSync wasn't doing the rewrite.

Helper: ``_upgrade_deezer_cover_url(url, target_size=1900)``, a pure
function lifted to one boundary so cover-download sites don't each
re-implement the regex. Tests pin every input shape:

- Standard Deezer URL upgraded to target
- Non-Deezer URL returned unchanged
- Already at/above target returned unchanged (no needless rewrite)
- Empty / None returned as-is
- Custom target applied correctly
- Picture URLs (artist): same path pattern, also upgraded
"""
from __future__ import annotations

import pytest

from core.deezer_client import _upgrade_deezer_cover_url


# ---------------------------------------------------------------------------
# Standard upgrade — the headline case
# ---------------------------------------------------------------------------

class TestUpgradeStandardDeezerUrl:
    def test_default_target_1900(self):
        url = 'https://cdn-images.dzcdn.net/images/cover/abc123/1000x1000-000000-80-0-0.jpg'
        upgraded = _upgrade_deezer_cover_url(url)
        assert upgraded == 'https://cdn-images.dzcdn.net/images/cover/abc123/1900x1900-000000-80-0-0.jpg'

    def test_alternate_dzcdn_host(self):
        """Both `cdn-images.dzcdn.net` and `e-cdns-images.dzcdn.net`
        are valid Deezer CDN hosts. The helper must catch both."""
        url = 'https://e-cdns-images.dzcdn.net/images/cover/xyz/1000x1000-000000-80-0-0.jpg'
        upgraded = _upgrade_deezer_cover_url(url)
        assert '1900x1900' in upgraded
        assert upgraded.startswith('https://e-cdns-images.dzcdn.net/')

    def test_artist_picture_url_also_upgrades(self):
        """Artist `picture_xl` URLs follow the same `/SIZExSIZE-` path
        pattern on the same CDN, so the same upgrade applies."""
        url = 'https://cdn-images.dzcdn.net/images/artist/hash/1000x1000-000000-80-0-0.jpg'
        upgraded = _upgrade_deezer_cover_url(url)
        assert '1900x1900' in upgraded

    def test_500x500_upgrades(self):
        """Some albums on Deezer only have cover_big (500×500). The
        helper upgrades anything below target, not just 1000×1000."""
        url = 'https://cdn-images.dzcdn.net/images/cover/abc/500x500-000000-80-0-0.jpg'
        upgraded = _upgrade_deezer_cover_url(url)
        assert '1900x1900' in upgraded


# ---------------------------------------------------------------------------
# Custom target size
# ---------------------------------------------------------------------------

class TestCustomTargetSize:
    def test_smaller_target(self):
        """Caller can request a smaller size for bandwidth-sensitive
        cases (mobile, thumbnails, etc.)."""
        url = 'https://cdn-images.dzcdn.net/images/cover/abc/1000x1000-000000-80-0-0.jpg'
        upgraded = _upgrade_deezer_cover_url(url, target_size=600)
        # 1000 is already > 600, so this is a no-op: never DOWNGRADE.
        assert upgraded == url

    def test_larger_target_works(self):
        url = 'https://cdn-images.dzcdn.net/images/cover/abc/250x250-000000-80-0-0.jpg'
        upgraded = _upgrade_deezer_cover_url(url, target_size=1400)
        assert '1400x1400' in upgraded


# ---------------------------------------------------------------------------
# Already-upgraded URLs — no needless rewrite
# ---------------------------------------------------------------------------

class TestAlreadyUpgraded:
    def test_already_at_target_returned_unchanged(self):
        """Re-running the upgrade on an already-upgraded URL should
        be a no-op. Idempotency is important for cached URLs that may
        have been rewritten by a previous SoulSync version."""
        url = 'https://cdn-images.dzcdn.net/images/cover/abc/1900x1900-000000-80-0-0.jpg'
        assert _upgrade_deezer_cover_url(url) == url

    def test_above_target_returned_unchanged(self):
        """Defensive: if the URL is somehow LARGER than target, don't
        downgrade. Covers a cached URL from a future bigger-target
        setting, manual edits, etc."""
        url = 'https://cdn-images.dzcdn.net/images/cover/abc/3000x3000-000000-80-0-0.jpg'
        assert _upgrade_deezer_cover_url(url) == url


# ---------------------------------------------------------------------------
# Defensive — non-Deezer URLs left untouched
# ---------------------------------------------------------------------------

class TestNonDeezerUrls:
    @pytest.mark.parametrize('url', [
        'https://i.scdn.co/image/spotify-id-thing',             # Spotify
        'https://is4-ssl.mzstatic.com/image/100x100bb.jpg',     # iTunes
        'https://coverartarchive.org/release/abc/front',        # MB CAA
        'https://lastfm.freetls.fastly.net/i/u/770x0/abc.jpg',  # Last.fm
        'https://example.com/random.jpg',                       # Random
    ])
    def test_non_dzcdn_returned_unchanged(self, url):
        """The helper must NOT touch non-Deezer URLs. Mirrors the
        defensive check pattern the iTunes and Spotify upgrade
        helpers use."""
        assert _upgrade_deezer_cover_url(url) == url

    def test_dzcdn_url_without_size_segment_returned_unchanged(self):
        """Defensive: if Deezer ever changes the URL format, don't
        crash; return as-is and let the download attempt happen with
        the original URL."""
        url = 'https://cdn-images.dzcdn.net/images/cover/abc/some-other-format.jpg'
        assert _upgrade_deezer_cover_url(url) == url


# ---------------------------------------------------------------------------
# Empty / None inputs
# ---------------------------------------------------------------------------

class TestEmptyInputs:
    def test_empty_string(self):
        assert _upgrade_deezer_cover_url('') == ''

    def test_none(self):
        assert _upgrade_deezer_cover_url(None) is None
```

```
@@ -3416,6 +3416,7 @@ const WHATS_NEW = {
  '2.4.3': [
    // --- post-release patch work on the 2.4.3 line — entries hidden by _getLatestWhatsNewVersion until the build version bumps ---
    { date: 'Unreleased — 2.4.3 patch work' },
    { title: 'Deezer Cover Art: Embedded Covers No Longer Look Blurry', desc: 'discord report (tim): downloaded cover art via deezer metadata source came out visibly blurry in navidrome and on phones — particularly noticeable on large displays. cause: deezer\'s api returns `cover_xl` urls at 1000×1000 but the underlying cdn serves up to 1900×1900 by rewriting the size segment in the url path. soulsync wasn\'t doing the rewrite — same as iTunes mzstatic and spotify scdn already get upgraded. now `_upgrade_deezer_cover_url` (mirrors `_upgrade_spotify_image_url` pattern) rewrites the cdn url to request 1900×1900 before download. cdn serves source-native size when source < target so asking for 1900 on smaller-source albums returns the same bytes (no upscaling, no failure). applied at both download sites — auto post-process flow + the enhanced library view\'s "write tags to file" feature. existing `prefer_caa_art` toggle in settings → library → post-processing remains as the orthogonal workaround for users who want even higher quality (musicbrainz cover art archive, often 3000×3000+). 16 new tests pin: standard upgrade, alternate dzcdn host, artist picture urls, custom target sizes, idempotency on already-upgraded urls, defensive on non-deezer urls (spotify/itunes/caa/lastfm/random), empty/none handling.', page: 'settings' },
    { title: 'Cross-Script Artist Names No Longer Quarantine Files (Hiroyuki Sawano / 澤野弘之, Сергей Лазарев / Sergey Lazarev)', desc: 'github issue #442 (afonsog6): files where the artist tag was in one script and the expected metadata was in another — japanese kanji `澤野弘之` for `hiroyuki sawano`, cyrillic `сергей лазарев` for `sergey lazarev`, etc. — got quarantined post-download because acoustid verification scored the artist similarity at 0% (the two scripts share no characters). reporter could not even rescue the file via manual import — the import-modal goes through the same verifier and re-quarantined the same file. cause: verifier compared expected vs actual artist with raw `_similarity` and never consulted musicbrainz aliases, even though MB exposes them on every artist record. fix: new `core/matching/artist_aliases.py` pure helper with alias-aware comparison + new `artists.aliases` JSON column populated by the existing MB enrichment worker on every artist match (one extra `inc=aliases` request per artist) + new multi-tier resolver `MusicBrainzService.lookup_artist_aliases` (library DB → cache → live MB) so the verifier finds aliases even for un-enriched artists without thrashing the MB API. verifier resolves aliases ONCE per `verify_audio_file` call and feeds them through three artist comparison sites (best-match scoring, secondary scan when title matches but artist doesn\'t, final fallback scan). reporter\'s exact two cases reproduced as regression tests with stubbed MB service. backward compat: aliases unavailable / MB unreachable → verifier falls back to direct similarity (identical to pre-fix behaviour — never quarantines stricter than today). 70 new tests pin every layer: pure helper (28), service methods (31), verifier integration (11). audited adjacent artist-comparison sites (auto-import single-track id, discovery scoring, matching engine) — left untouched per scope discipline since they aren\'t the user-reported pain.', page: 'downloads' },
    { title: 'Plex: Library Scan Trigger No Longer Fails On Non-English Section Names', desc: 'github issue #535 (adrigzr): plex servers with the music library named anything other than "music" — Música, Musique, Musik, Musica, etc. — got a `Failed to trigger library scan for "Music": Invalid library section: Music` error after every import cycle, and `wishlist.processing` kept reporting "missing from media server after sync" for tracks that DID import correctly because the post-import scan never fired. cause: `trigger_library_scan` and `is_library_scanning` ignored the auto-detected `self.music_library` (correctly populated by `_find_music_library` filtering by `section.type == "artist"`) and called `self.server.library.section(library_name)` with a hardcoded "music" default — raised NotFound on any non-english server. read methods like `get_artists` already routed through `_get_music_sections` so they didn\'t have the bug; this aligns the scan-trigger path with the same resolution. fix: both single-library branches prefer `self.music_library` first, fall back to literal section lookup only when auto-detection hasn\'t run. activity-feed match in `is_library_scanning` also corrected to use the resolved section\'s actual title instead of the unused `library_name` arg — the prior log line read "triggered scan for music" even on Spanish servers. 13 new tests pin: trigger uses auto-detected section across 6 locale variants (Música / Musique / Musik / Musica / 音乐 / موسيقى), backward-compat fallback when music_library is None, explicit library_name kwarg ignored when auto-detected section exists, log line surfaces correct section title, scan-status check uses auto-detected section\'s `refreshing` attr, activity-feed match filters by resolved title (not library_name).', page: 'settings' },
    { title: 'Search For Match: No More Karaoke / Cover / "Originally Performed By" Junk At The Top', desc: 'github issue #534 (radoslav-orlov): typing "dirty white boy" + "foreigner" into the import-modal "search for match" dialog returned karaoke versions, "originally performed by" compilations, and tribute-band cuts ranked above the actual foreigner studio recording in some regions. user had to scroll past 5+ junk results before finding the canonical track. fix: new `core/metadata/relevance.py` helper reranks results locally with cover/karaoke/tribute/re-recorded penalties (multiplier 0.05× — effectively buries) + exact-artist-match boost (1.5×) + variant-tag (live/acoustic/remix/remaster) penalty (0.4×, skipped when user explicitly typed the variant — searching "track (live)" still ranks live versions correctly). applied at the deezer + itunes + spotify search-tracks endpoints so all three sources behave consistently. validated against live deezer api with the actual #534 query: real foreigner head games cut now lands at #1, live versions follow, karaoke / cover / tribute variants drop to positions 11-15. deezer client also gained optional field-scoped query kwargs (`track="X" artist="Y"`) that build deezer\'s advanced search syntax `track:"X" artist:"Y"` for future opt-in callers (e.g. exact-match flows where api-level filtering is more important than ranking) — kept in client but NOT used at the import-modal endpoint after live testing showed the advanced syntax has its own ranking bias (surfaced "(2008 remaster)" instead of the canonical recording). free-text + local rerank is the more reliable combination here. 75 new tests pin every scoring component, pattern detection (13 cover patterns, 11 variant patterns, 3 fields), score composition (real-cut > karaoke > remaster > re-recorded), the issue #534 screenshot reproduced as a regression test, deezer client query construction + free-text fallback safety net.', page: 'import' },
```
