We have terabytes of old data that we rarely need, and it costs a lot in storage
Archive tiers (S3 Glacier Deep Archive, Azure Archive, GCS Coldline/Archive) are 80-95 percent cheaper than standard storage. Note: retrieval costs money and takes hours to days, so these tiers only suit data you genuinely almost never access.
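To make the savings concrete, a rough monthly comparison. The per-GB prices below are assumptions based on published S3 us-east-1 list prices at time of writing; check current pricing for your region and provider:

```python
# Rough monthly storage cost for 50 TB.
# Per-GB-month prices are illustrative (approximate S3 us-east-1 list
# prices); verify against your provider's current pricing page.
STANDARD_PER_GB = 0.023        # S3 Standard, USD per GB-month (assumed)
DEEP_ARCHIVE_PER_GB = 0.00099  # S3 Glacier Deep Archive, USD per GB-month (assumed)

size_gb = 50 * 1024  # 50 TB

standard_cost = size_gb * STANDARD_PER_GB
archive_cost = size_gb * DEEP_ARCHIVE_PER_GB
savings_pct = 100 * (1 - archive_cost / standard_cost)

print(f"Standard:     ${standard_cost:,.2f}/month")
print(f"Deep Archive: ${archive_cost:,.2f}/month")
print(f"Savings:      {savings_pct:.1f}%")
```

At these example prices, 50 TB drops from roughly $1,200/month to around $50/month, which is where the 80-95 percent figure comes from.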
Try this first
1. Define 'archive': data you read less than once per year. Tax records from 6 years back, yes. Customer photos from 6 months ago, no.
2. Set lifecycle rules: e.g. Standard to Standard-IA after 30 days, to Glacier Instant Retrieval after 90, to Deep Archive after 180.
3. For Azure Blob: Hot to Cool to Archive via lifecycle management. For GCS: Object Lifecycle Management with a SetStorageClass action.
4. Calculate retrieval costs upfront: typically $0.0025 (Glacier Deep Archive Bulk) to $0.03 (Glacier Standard) per GB, plus a per-request fee. A multi-TB restore at the standard tier adds up to hundreds of euros quickly.
5. Test the retrieval process: how do you request a file back, and how long does it really take? A bulk restore from Glacier Deep Archive takes 12-48 hours.
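The S3 schedule from step 2 can be written as a lifecycle configuration. A minimal sketch, assuming everything under a hypothetical `archive/` prefix should age out (adjust the prefix and bucket name to your layout):

```python
# Lifecycle rule matching step 2: Standard -> Standard-IA (30 days)
# -> Glacier Instant Retrieval (90 days) -> Deep Archive (180 days).
# The 'archive/' prefix and bucket name are placeholders.
lifecycle_config = {
    "Rules": [
        {
            "ID": "tier-down-old-data",
            "Status": "Enabled",
            "Filter": {"Prefix": "archive/"},
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER_IR"},
                {"Days": 180, "StorageClass": "DEEP_ARCHIVE"},
            ],
        }
    ]
}

# Applying it needs AWS credentials, so the call is commented out:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-bucket", LifecycleConfiguration=lifecycle_config
# )
print(lifecycle_config["Rules"][0]["Transitions"])
```

The same configuration can be applied from the CLI with `aws s3api put-bucket-lifecycle-configuration`.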
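The upfront calculation in step 4 is simple enough to script. Using the per-GB figures mentioned above (assumed; request fees and region differences are ignored here):

```python
# Retrieval cost estimate using the per-GB figures from step 4:
# $0.0025/GB (Deep Archive Bulk) vs $0.03/GB (Glacier Standard).
# Per-request fees and regional price differences are ignored.
def retrieval_cost(size_tb: float, per_gb_usd: float) -> float:
    """Estimated retrieval cost in USD for size_tb terabytes."""
    return size_tb * 1024 * per_gb_usd

bulk = retrieval_cost(10, 0.0025)    # 10 TB via the slow bulk tier
standard = retrieval_cost(10, 0.03)  # 10 TB via the faster standard tier
print(f"Bulk:     ${bulk:,.2f}")
print(f"Standard: ${standard:,.2f}")
```

The spread is the point: the same 10 TB costs a few tens of dollars via bulk but hundreds via the faster tier, so decide on retrieval speed before the first restore, not during it.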
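For step 5: getting a file back from Deep Archive is an explicit restore request, not a normal read. A sketch of what that looks like with boto3, with bucket and key as placeholders (the call itself is commented out since it needs AWS credentials and a real archived object):

```python
# Restore request for a Deep Archive object: make a temporary readable
# copy available for 7 days, via the cheap but slow Bulk tier (12-48 h).
restore_request = {
    "Days": 7,  # how long the restored copy stays readable
    "GlacierJobParameters": {"Tier": "Bulk"},  # Bulk or Standard;
    # the Expedited tier is not available for Deep Archive.
}

# import boto3
# boto3.client("s3").restore_object(
#     Bucket="my-bucket",              # placeholder
#     Key="archive/2019/ledger.csv",   # placeholder
#     RestoreRequest=restore_request,
# )
print(restore_request)
```

Run this once against a test object and time it end to end; that measured duration is what you quote to the business, not the number from the pricing page.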
When to bring us in
If compliance demands a contractually fixed retrieval time (e.g. within 4 hours), pick a hot tier or dedicated archive solution like Wasabi. A short review avoids a mismatch.
See also
- Everyone logs in with the AWS root account: Root is for emergencies and billing. Day-to-day work belongs in IAM users or SSO.
- Every developer has AdministratorAccess: AdministratorAccess everywhere is convenient now, painful later. Start with role-based policies.
- Everyone has individual IAM users with their own password: Identity Center (formerly AWS SSO) links to your IdP and issues temporary credentials per session.
None of the above fits?
Describe your situation below. We pass your input plus the steps you already saw to our AI and return tailored next-step advice. If it's too risky to DIY, we'll say so.
Or skip the DIY entirely
Our Managed IT clients do not have to look these things up. One point of contact, a fixed monthly price, and issues resolved within working hours.