Fix phantom duplicate tracks created on Merged file ingestion #3

Merged
ab merged 1 commit from DEV into main 2026-03-19 23:43:37 +00:00
Owner

When the mover returns MoveOutcome::Merged (destination already exists,
source deleted), approve_and_finalize was checking only file_hash to
detect duplicates. Since the second ingestion had a different hash
(different quality/mastering), it bypassed the check and created a
phantom track record pointing to an existing storage_path with the
wrong hash (of the now-deleted source file).

Added a second dedup check by storage_path: if a non-hidden track
already exists at that path, mark pending as 'merged' instead of
inserting a new track row. This prevents phantom entries for any
subsequent ingestion of a different-quality version of an already
stored file.

Co-Authored-By: Claude Sonnet 4.6 (1M context) <noreply@anthropic.com>
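
The fix described above can be sketched roughly as follows. This is a minimal, hypothetical illustration of the two-stage dedup logic, not the actual furumi-ng code: the `Library`, `Track`, `PendingStatus`, and `finalize_merged` names are invented for this example, and the real implementation presumably runs against a database rather than an in-memory vector.

```rust
// Hypothetical sketch of the dedup logic in approve_and_finalize when the
// mover reports MoveOutcome::Merged (destination already exists, source
// deleted). All names here are illustrative, not the real API.

#[derive(Debug, Clone, PartialEq)]
enum PendingStatus {
    Approved, // a new track row was inserted
    Merged,   // duplicate detected; no new row
}

#[derive(Debug, Clone)]
struct Track {
    file_hash: String,
    storage_path: String,
    hidden: bool,
}

struct Library {
    tracks: Vec<Track>,
}

impl Library {
    fn finalize_merged(&mut self, file_hash: &str, storage_path: &str) -> PendingStatus {
        // First dedup check: this exact content hash was already ingested.
        if self.tracks.iter().any(|t| t.file_hash == file_hash) {
            return PendingStatus::Merged;
        }
        // Second dedup check (the fix): a non-hidden track already owns
        // this storage path. A different-quality version has a different
        // hash, so it slips past the hash check; inserting a row here
        // would create a phantom track pointing at the existing file
        // with the hash of the now-deleted source.
        if self.tracks.iter().any(|t| !t.hidden && t.storage_path == storage_path) {
            return PendingStatus::Merged;
        }
        // No duplicate by hash or by path: insert a real track row.
        self.tracks.push(Track {
            file_hash: file_hash.to_string(),
            storage_path: storage_path.to_string(),
            hidden: false,
        });
        PendingStatus::Approved
    }
}

fn main() {
    let mut lib = Library { tracks: vec![] };
    // First ingestion stores the file normally.
    assert_eq!(
        lib.finalize_merged("hash-a", "/music/song.flac"),
        PendingStatus::Approved
    );
    // Second ingestion: different quality/mastering, so a different hash,
    // but the same destination path. Before the fix this inserted a
    // phantom row; with the path check it is marked as merged.
    assert_eq!(
        lib.finalize_merged("hash-b", "/music/song.flac"),
        PendingStatus::Merged
    );
    assert_eq!(lib.tracks.len(), 1);
}
```

The second check deliberately ignores hidden tracks, matching the "non-hidden" condition in the description, so a path previously occupied by a hidden (soft-deleted) track can still be reused by a fresh ingestion.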
ab added 1 commit 2026-03-19 23:43:33 +00:00
Fix phantom duplicate tracks created on Merged file ingestion
All checks were successful
Publish Metadata Agent Image (dev) / build-and-push-image (push) Successful in 1m16s
Publish Web Player Image (dev) / build-and-push-image (push) Successful in 1m10s
Publish Server Image / build-and-push-image (push) Successful in 2m36s
cc3ef04cbe
ab merged commit 3f2013e9d5 into main 2026-03-19 23:43:37 +00:00

Reference: ab/furumi-ng#3