We have gigabytes of logs but can't search through them
Athena (AWS), Log Analytics with KQL (Azure), or Cloud Logging's Logs Explorer (GCP) are the right tools. The key is partitioning your data and keeping a few standard queries you reuse.
Try this first
1. For S3-based logs (CloudTrail, ALB, S3 access logs): set up Athena and use partition projection on date. That makes queries roughly 10x faster and cheaper.
2. For live-stream logs in CloudWatch Logs: use CloudWatch Logs Insights with fields/filter/stats. Fine up to a few GB per day.
3. For log management at scale, consider a SIEM (Sentinel, Chronicle, or Datadog). Above roughly 50 GB/day, Athena becomes slower and pricier than a SIEM.
4. Write a handful of standard queries on a wiki page: 'who ran GetSecretValue?', 'which IPs got 4xx responses?'. Otherwise you reinvent them every time.
5. Set cost alarms on your query engine. Athena charges per TB scanned; a runaway query can cost hundreds of euros per hour.
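Partition projection from step 1 is configured in the table DDL rather than by adding partitions manually. A minimal sketch for JSON logs stored under date-based S3 prefixes; the bucket name, prefix layout, and columns below are placeholders, so adapt them to your own log format (CloudTrail in particular has its own table definition in the AWS docs):

```sql
-- Sketch: Athena table with partition projection on a date column.
-- Bucket, prefix, and columns are hypothetical examples.
CREATE EXTERNAL TABLE app_logs (
    eventTime       STRING,
    eventName       STRING,
    sourceIPAddress STRING,
    status          INT
)
PARTITIONED BY (`date` STRING)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION 's3://example-log-bucket/logs/'
TBLPROPERTIES (
    'projection.enabled'        = 'true',
    'projection.date.type'      = 'date',
    'projection.date.range'     = '2023/01/01,NOW',
    'projection.date.format'    = 'yyyy/MM/dd',
    'projection.date.interval'  = '1',
    'projection.date.interval.unit' = 'DAYS',
    'storage.location.template' = 's3://example-log-bucket/logs/${date}'
);
```

With projection enabled, any query that filters on `"date"` only scans the matching S3 prefixes, which is where the speed and cost win comes from.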
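The fields/filter/stats pattern from step 2 might look like this in CloudWatch Logs Insights, here answering 'which IPs got 4xx responses?'. The field names `status` and `sourceIP` are assumptions that depend on how your logs are structured:

```sql
# Logs Insights sketch: top 20 source IPs with 4xx responses.
# 'status' and 'sourceIP' are assumed field names; use your own.
fields @timestamp, @message
| filter status >= 400 and status <= 499
| stats count(*) as hits by sourceIP
| sort hits desc
| limit 20
```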
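A wiki entry for step 4's 'who ran GetSecretValue?' could hold an Athena query like the sketch below. The table name `cloudtrail_logs` and its columns are hypothetical; adjust them to your own schema, and note the date filter, which keeps partition projection effective:

```sql
-- Sketch: who called GetSecretValue in the last 7 days?
-- Table and column names are placeholders for your own schema.
SELECT eventTime, userIdentity.arn, sourceIPAddress
FROM cloudtrail_logs
WHERE eventName = 'GetSecretValue'
  AND "date" >= date_format(current_date - interval '7' day, '%Y/%m/%d')
ORDER BY eventTime DESC;
```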
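To see why step 5 matters, the per-TB pricing is easy to sanity-check. The sketch below assumes a list price of $5 per TB scanned, which is common for Athena but varies by region, so verify against the AWS pricing page:

```python
# Rough Athena cost estimate: billing is per TB of data scanned.
PRICE_PER_TB = 5.00  # USD per TB scanned (assumed list price; check your region)

def athena_query_cost(bytes_scanned: int, price_per_tb: float = PRICE_PER_TB) -> float:
    """Estimated cost in USD for a query that scanned `bytes_scanned` bytes."""
    tb_scanned = bytes_scanned / 1_000_000_000_000  # decimal TB
    return tb_scanned * price_per_tb

# Full scan of 2 TB of unpartitioned logs:
print(round(athena_query_cost(2_000_000_000_000), 2))  # 10.0
# Same question against a single day's partition (~5 GB):
print(round(athena_query_cost(5_000_000_000), 4))      # 0.025
```

One such query per hour is cheap; the same query in a dashboard refreshing every minute is not, which is exactly what the cost alarm is for.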
When to bring us in
For real incident response or a pen test where you're asked about a specific user from 2 years ago, an experienced SOC analyst is usually faster than DIY.
See also
- Everyone logs in with the AWS root account: root is for emergencies and billing. Day-to-day work belongs in IAM users or SSO.
- Every developer has AdministratorAccess: AdministratorAccess everywhere is convenient now, painful later. Start with role-based policies.
- Everyone has individual IAM users with their own password: Identity Center (formerly AWS SSO) links to your IdP and issues temporary credentials per session.
None of the above fits?
Describe your situation below. We pass your input plus the steps you already saw to our AI and return tailored next-step advice. If it's too risky to DIY, we'll say so.
Or skip the DIY entirely
Our Managed IT clients don't look these things up themselves. One point of contact, a fixed monthly price, and issues resolved within working hours.