Redis-backed deduplication middleware for Taskiq that prevents duplicate tasks from being queued or executed concurrently.
Documentation: https://taskiq-deduplication.d3vyce.fr
Source Code: https://github.com/d3vyce/taskiq-deduplication
```shell
uv add taskiq-deduplication
```

```python
from taskiq_redis import ListQueueBroker

from taskiq_deduplication import RedisDeduplicationMiddleware, DuplicateTaskError

broker = ListQueueBroker("redis://localhost:6379").with_middlewares(
    RedisDeduplicationMiddleware(redis_url="redis://localhost:6379"),
)


@broker.task
async def send_report(user_id: int) -> None:
    ...


# First dispatch acquires the lock — succeeds.
await send_report.kiq(user_id=42)

# Second dispatch while the first is queued or running — raises.
try:
    await send_report.kiq(user_id=42)
except DuplicateTaskError:
    pass  # already queued or running
```

- Sender-side deduplication — rejects duplicate tasks at dispatch time via a Redis lock, before they reach the broker.
- Atomic lock release — the lock is released on completion or error via a Lua check-and-delete; only the owning task can release its lock.
- Configurable TTL — set a global default or override it per task with the `deduplication_ttl` label.
- Explicit lock key — pin any task to a fixed Redis key with `deduplication_key`, bypassing fingerprint computation entirely.
- Partial fingerprint — deduplicate on a subset of kwargs with `deduplication_key_fields`, ignoring irrelevant arguments.
- Per-task opt-out — disable deduplication for individual tasks with the `deduplication` label.
- Startup resilience — automatic reconnection with exponential backoff if Redis is unavailable at broker startup.
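The per-task labels above can be combined freely. A hedged sketch, assuming labels are passed as decorator keyword arguments in the usual Taskiq style (the task names and values below are illustrative, not from the library's docs):

```python
from taskiq_redis import ListQueueBroker

from taskiq_deduplication import RedisDeduplicationMiddleware

broker = ListQueueBroker("redis://localhost:6379").with_middlewares(
    RedisDeduplicationMiddleware(redis_url="redis://localhost:6379"),
)


# Override the lock TTL for this task only (value is illustrative).
@broker.task(deduplication_ttl=300)
async def nightly_export(account_id: int) -> None:
    ...


# Deduplicate on account_id alone; ignore the free-form `reason` argument.
@broker.task(deduplication_key_fields=["account_id"])
async def suspend_account(account_id: int, reason: str) -> None:
    ...


# Opt this task out of deduplication entirely.
@broker.task(deduplication=False)
async def send_metric(name: str, value: float) -> None:
    ...
```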
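To make the "atomic lock release" point concrete, here is a minimal in-memory sketch of check-and-delete semantics: only the caller holding the current lock token may delete the key. In the real middleware the comparison and deletion run atomically inside Redis via a Lua script; the dict-based helpers and the script text below are illustrative, not the library's actual implementation.

```python
# Redlock-style release script (illustrative; the middleware's exact
# script may differ): delete the key only if it still holds our token.
CHECK_AND_DELETE = """
if redis.call("get", KEYS[1]) == ARGV[1] then
    return redis.call("del", KEYS[1])
else
    return 0
end
"""


def acquire(store: dict, key: str, token: str) -> bool:
    """Like SET key token NX: succeed only if no lock is currently held."""
    if key in store:
        return False
    store[key] = token
    return True


def release(store: dict, key: str, token: str) -> bool:
    """Delete the key only if it still holds this caller's token."""
    if store.get(key) == token:
        del store[key]
        return True
    return False


locks: dict = {}
assert acquire(locks, "send_report:42", "owner-a")      # lock taken
assert not acquire(locks, "send_report:42", "owner-b")  # duplicate rejected
assert not release(locks, "send_report:42", "owner-b")  # non-owner cannot release
assert release(locks, "send_report:42", "owner-a")      # owner releases
```

The token comparison is what prevents a slow task whose lock already expired from deleting a lock that a newer dispatch has since acquired.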
MIT License - see LICENSE for details.
Contributions are welcome! Please feel free to submit issues and pull requests.