Sometimes a customer receives the same email two or three times; sometimes a record is duplicated.
At-least-once delivery is normal for webhooks and queues. A dedup key ensures that a second delivery doesn't trigger a second action. Alongside idempotency, this is one of the most important workflow patterns.
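A dedup key only works if it is stable across redeliveries. A minimal sketch of deriving one from a payload, preferring a natural ID and falling back to a hash of source-id plus timestamp (the field names `event_id`, `message_id`, `order_id`, `source_id`, and `timestamp` are assumptions; adapt them to your payload):

```python
import hashlib

def dedup_key(payload: dict) -> str:
    """Derive a stable dedup key from a webhook payload.

    Prefers a natural unique ID; falls back to a hash of
    source-id plus timestamp. Field names are assumptions,
    not a fixed schema.
    """
    for field in ("event_id", "message_id", "order_id"):
        if payload.get(field):
            return f"{field}:{payload[field]}"
    # Fallback: hash only the identifying parts, not the whole
    # payload, so retries with cosmetic differences still collide.
    basis = f"{payload.get('source_id')}|{payload.get('timestamp')}"
    return "h:" + hashlib.sha256(basis.encode()).hexdigest()

print(dedup_key({"event_id": "evt_123"}))  # event_id:evt_123
print(dedup_key({"source_id": "crm", "timestamp": "2024-05-01T12:00:00Z"}))
```

The same payload must always yield the same key; anything volatile (delivery attempt counters, received-at timestamps added by the transport) must stay out of the hash.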
Try this first
1. Pick a natural unique key from the payload: order-id, event-id, message-id, or a hash of source-id plus timestamp.
2. Store processed keys in a DB table or a key-value store (Airtable, Postgres, Upstash, Make Data Store, n8n Static Data) with a TTL of, say, 30 days.
3. First flow step: check if the key already exists. If yes, log 'duplicate, skipped' and stop. If no, write the key and proceed.
4. Write the key as early as possible in the flow, not at the end. Otherwise a mid-flow crash can still cause double processing.
5. Monitor duplicates per day: a healthy flow sees a few; a spike points to a source bug or replay storm.
When to bring us in
If you need this across multiple flows, a shared dedup table with TTL and index pays off. We can deliver the schema.
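One possible shape for such a shared table, sketched here with SQLite (column and index names are illustrative, not a delivered schema): a `flow_name` column scopes keys per flow, an index on the claim timestamp keeps the TTL purge cheap, and a scheduled job enforces the TTL.

```python
import sqlite3
import time

db = sqlite3.connect(":memory:")
# One shared table for all flows; flow_name scopes keys so
# different flows can reuse the same natural IDs.
db.executescript("""
    CREATE TABLE processed_keys (
        flow_name  TEXT NOT NULL,
        dedup_key  TEXT NOT NULL,
        claimed_at INTEGER NOT NULL,
        PRIMARY KEY (flow_name, dedup_key)
    );
    CREATE INDEX idx_processed_keys_claimed_at
        ON processed_keys (claimed_at);
""")

TTL_SECONDS = 30 * 24 * 3600  # the 30-day TTL from the steps above

def purge_expired(now=None) -> int:
    """Run periodically (cron or a scheduled flow) to enforce the TTL.

    Returns the number of expired keys removed.
    """
    now = now if now is not None else int(time.time())
    with db:
        cur = db.execute(
            "DELETE FROM processed_keys WHERE claimed_at < ?",
            (now - TTL_SECONDS,),
        )
    return cur.rowcount
```

Stores with native TTL (Upstash/Redis `SET key 1 EX 2592000 NX`, for example) make the purge job unnecessary, but a plain table keeps the dedup history queryable for the per-day duplicate monitoring mentioned above.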
See also
- n8n: self-host or cloud? Self-hosted is cheaper at volume and keeps data local. Cloud removes ops burden.
- Zapier or Make: which fits better? Zapier is straight-line; Make handles complex flows with routers and iterators for less money.
- Power Automate Cloud or Desktop: which to use? Cloud for SaaS integrations and triggers. Desktop for RPA against legacy Windows apps without APIs.
None of the above fits?
Describe your situation below. We pass your input plus the steps you already saw to our AI and return tailored next-step advice. If it's too risky to DIY, we'll say so.
Or skip the DIY entirely
Our Managed IT clients do not look these things up. One point of contact, a fixed monthly price, and issues resolved within working hours.