Sometimes a customer gets the same email two or three times, sometimes a record is duplicated

At-least-once delivery is normal for webhooks and queues. A dedup key ensures a second delivery doesn't trigger a second action. Alongside idempotency, this is one of the most important workflow patterns.

Try this first

  1. Pick a natural unique key from the payload: order-id, event-id, message-id, or a hash of source-id plus timestamp (see the key sketch after this list).
  2. Store processed keys in a DB table or a key-value store (Airtable, Postgres, Upstash, Make Data Store, n8n Static Data) with a TTL of, say, 30 days.
  3. First flow step: check if the key already exists. If yes, log 'duplicate, skipped' and stop. If no, write the key and proceed (the claim sketch below does both in one atomic step).
  4. Write the key as early as possible in the flow, not at the end. Otherwise a mid-flow crash can still cause double processing.
  5. Monitor duplicates per day: a healthy flow sees a few, a spike points to a source bug or a replay storm (a counter sketch follows below).
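
A minimal sketch of step 1 in TypeScript (Node). The payload shape and field names (`event_id`, `source_id`, `occurred_at`) are assumptions; prefer an ID the source already guarantees unique, and fall back to a hash only when there is none.

```typescript
import { createHash } from "node:crypto";

// Hypothetical payload shape; adjust the field names to your source.
interface WebhookPayload {
  event_id?: string;   // preferred: an ID the source guarantees unique
  source_id: string;
  occurred_at: string; // ISO timestamp
}

// Derive a stable dedup key: the natural ID when present,
// otherwise a hash of source-id plus timestamp.
function dedupKey(p: WebhookPayload): string {
  if (p.event_id) return `evt:${p.event_id}`;
  const digest = createHash("sha256")
    .update(`${p.source_id}|${p.occurred_at}`)
    .digest("hex");
  return `evt:${digest}`;
}
```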
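
Steps 2 through 4 collapse into a single atomic "claim" when the store supports set-if-not-exists with a TTL, as Redis does. A sketch using ioredis (the `REDIS_URL` variable and the 30-day TTL are assumptions):

```typescript
import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL ?? "redis://localhost:6379");
const TTL_SECONDS = 30 * 24 * 60 * 60; // ~30 days; tune to your replay window

// SET ... NX EX writes the key only if it is absent, so the check
// and the write cannot race between two parallel deliveries.
async function claim(key: string): Promise<boolean> {
  const result = await redis.set(key, "1", "EX", TTL_SECONDS, "NX");
  return result === "OK"; // null means the key already existed
}

async function handleDelivery(key: string): Promise<void> {
  if (!(await claim(key))) {
    console.log(`duplicate, skipped: ${key}`);
    return;
  }
  // ...do the actual work. The key is already recorded, so a crash
  // here cannot cause a second full run on redelivery.
}
```

Claiming first trades a rare lost run (a crash right after the claim) against never running twice; if you need both, pair the claim with a status field you flip on completion.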
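
For step 5, a cheap way to get a duplicates-per-day number is a self-expiring counter next to the dedup keys; the `dup:` key prefix is illustrative:

```typescript
import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL ?? "redis://localhost:6379");

// Call this wherever the flow logs 'duplicate, skipped'.
async function countDuplicate(): Promise<void> {
  const day = new Date().toISOString().slice(0, 10); // e.g. "2025-05-01"
  const counter = `dup:${day}`;
  await redis.incr(counter);
  await redis.expire(counter, 90 * 24 * 60 * 60); // keep ~90 days of history
}
```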

When to bring us in

If you need this across multiple flows, a shared dedup table with a TTL and an index pays off. We can deliver the schema; a rough sketch of the shape follows below.
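
As a rough idea of the shape, not the deliverable itself: a shared Postgres table keyed by flow name plus dedup key, with the claim as one atomic INSERT and expiry as a scheduled delete (Postgres has no native TTL). Table and column names are assumptions.

```typescript
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// One table shared by every flow; the primary key doubles as the dedup index.
// Run once at deploy time: await pool.query(SCHEMA);
const SCHEMA = `
  CREATE TABLE IF NOT EXISTS processed_events (
    dedup_key    text        NOT NULL,
    flow         text        NOT NULL,
    processed_at timestamptz NOT NULL DEFAULT now(),
    PRIMARY KEY (flow, dedup_key)
  );
  CREATE INDEX IF NOT EXISTS processed_events_age
    ON processed_events (processed_at);
`;

// Claim a key: INSERT ... ON CONFLICT DO NOTHING is atomic, so the
// check-and-write from step 3 becomes a single statement.
async function claim(flow: string, key: string): Promise<boolean> {
  const res = await pool.query(
    `INSERT INTO processed_events (flow, dedup_key)
     VALUES ($1, $2) ON CONFLICT DO NOTHING`,
    [flow, key],
  );
  return res.rowCount === 1; // 0 rows inserted means duplicate
}

// The "TTL": run on a schedule (cron, pg_cron) to drop old keys.
async function purgeOld(days = 30): Promise<void> {
  await pool.query(
    `DELETE FROM processed_events
     WHERE processed_at < now() - make_interval(days => $1)`,
    [days],
  );
}
```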

None of the above fits?

Describe your situation below. We pass your input, plus the steps you already saw, to our AI and return tailored next-step advice. If it's too risky to DIY, we'll say so.

Who are you?

For the AI question we ask for your email and company, so we can follow up if the AI gets stuck and to prevent abuse.

Limited to 2 questions per hour and 5 per day; we keep it lean so the AI stays useful. If you need more, contacting us directly works better for you and for us.

Or skip the DIY entirely

Our Managed IT clients don't have to look these things up: one point of contact, a fixed monthly price, and issues resolved within working hours.