We have gigabytes of logs but can't search through them

Athena (AWS), Log Analytics with KQL (Azure), or Cloud Logging's Logs Explorer (GCP) are the right tools. The key is partitioning plus a small set of standard queries you reuse.

Try this first

  1. For S3-based logs (CloudTrail, ALB, S3 access logs): set up Athena and use partition projection on the date. It makes queries roughly 10x faster and cheaper.
  2. For live-stream logs in CloudWatch Logs: use CloudWatch Logs Insights with fields/filter/stats. Fine up to a few GB per day.
  3. For log management at scale, consider a SIEM (Sentinel, Chronicle, or Datadog). Above roughly 50 GB/day, Athena becomes slower and pricier than a SIEM.
  4. Write a handful of standard queries on a wiki page: 'who ran GetSecretValue?', 'which IPs had 4xx errors?'. Otherwise you reinvent them every time.
  5. Set cost alarms on your query engine. Athena charges per TB scanned; a runaway query can cost hundreds of euros per hour.
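Steps 1 and 4 combine naturally: define the Athena table once with partition projection, then save the query on your wiki. A simplified sketch for CloudTrail is below; the bucket, account ID, region, and start date are placeholders, and the full column list (useridentity is a much larger struct) is in the AWS documentation.

```sql
-- Simplified CloudTrail table with partition projection on date.
-- s3://example-bucket, account 123456789012, and eu-west-1 are placeholders.
CREATE EXTERNAL TABLE cloudtrail_logs (
  eventtime       string,
  eventname       string,
  useridentity    struct<arn:string>,   -- subset of the real struct
  sourceipaddress string
)
PARTITIONED BY (`date` string)
ROW FORMAT SERDE 'com.amazon.emr.hive.serde.CloudTrailSerde'
STORED AS INPUTFORMAT 'com.amazon.emr.cloudtrail.CloudTrailInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION 's3://example-bucket/AWSLogs/123456789012/CloudTrail/eu-west-1/'
TBLPROPERTIES (
  'projection.enabled'           = 'true',
  'projection.date.type'         = 'date',
  'projection.date.range'        = '2023/01/01,NOW',
  'projection.date.format'       = 'yyyy/MM/dd',
  'projection.date.interval'     = '1',
  'projection.date.interval.unit'= 'DAYS',
  'storage.location.template'    = 's3://example-bucket/AWSLogs/123456789012/CloudTrail/eu-west-1/${date}/'
);

-- 'Who ran GetSecretValue?' -- the date filter is what keeps the scan small.
SELECT eventtime, useridentity.arn, sourceipaddress
FROM cloudtrail_logs
WHERE "date" = '2024/06/01'
  AND eventname = 'GetSecretValue';
```

Because the partitions are projected from the template rather than registered in the Glue catalog, no MSCK REPAIR or partition maintenance is needed as new days arrive.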
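For step 2, the fields/filter/stats pattern in CloudWatch Logs Insights covers the 'which IPs had 4xx?' standard query. A minimal sketch, assuming JSON access logs with status and clientIp fields (adjust the names to your log format):

```
fields @timestamp, @message
| filter status >= 400 and status < 500
| stats count(*) as hits by clientIp, status
| sort hits desc
| limit 20
```

Pick a short time range in the console first; Logs Insights bills per GB scanned, so the same cost discipline as step 5 applies here.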

When to bring us in

For real incident response, or a pen-test follow-up asking what a specific user did two years ago, an experienced SOC analyst is usually faster than DIY.

None of the above fits?

Describe your situation below. We pass your input plus the steps you already saw to our AI and return tailored next-step advice. If it's too risky to DIY, we'll say so.

Who are you?

For the AI question we need your email and company, so we can follow up if the AI gets stuck, and to prevent abuse.

Limited to 2 questions per hour and 5 per day, kept lean so the AI stays useful. If you need more, contacting us directly works better for both of us.

Or skip the DIY entirely

Our Managed IT clients don't look these things up themselves. One point of contact, a fixed monthly price, issues resolved within working hours.