Groq Cloud incident

meta-llama/llama-4-scout-17b-16e-instruct Degraded Performance

Severity: Minor · Status: Resolved

Groq Cloud experienced a minor incident on January 27, 2026 affecting meta-llama/llama-4-scout-17b-16e-instruct. The incident has been resolved; the full update timeline is below.

Started: Jan 27, 2026, 01:11 AM UTC
Resolved: Jan 27, 2026, 01:11 AM UTC
Duration: not reported
Detected by Pingoru: Jan 27, 2026, 01:11 AM UTC

Affected components

meta-llama/llama-4-scout-17b-16e-instruct

Update timeline

  1. Investigating (Jan 26, 2026, 11:17 PM UTC)

    We are currently investigating an issue with meta-llama/llama-4-scout-17b-16e-instruct. Users may be experiencing elevated error rates or slow response times. Other models remain operational. Our team is analyzing logs and infrastructure to identify the cause.

    Affected components: meta-llama/llama-4-scout-17b-16e-instruct (Degraded performance)

  2. Identified (Jan 27, 2026, 12:06 AM UTC)

    We have implemented a fix for meta-llama/llama-4-scout-17b-16e-instruct; however, performance is still degraded. We are continuing to investigate the models to determine the cause of the issues.

    Affected components: meta-llama/llama-4-scout-17b-16e-instruct (Degraded performance)

  3. Monitoring (Jan 27, 2026, 12:29 AM UTC)

    We have implemented a new fix for meta-llama/llama-4-scout-17b-16e-instruct and groq/compound, and performance has improved. We are continuing to monitor the models to ensure stability persists. If all remains normal, we will resolve the incident in the next update.

    Affected components: meta-llama/llama-4-scout-17b-16e-instruct (Operational)

  4. Resolved (Jan 27, 2026, 01:11 AM UTC)

    This incident has been resolved. Thank you for your patience.

    Affected components: meta-llama/llama-4-scout-17b-16e-instruct (Operational)
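During incidents like this, where one model shows elevated error rates while other models remain operational, API clients can hedge by retrying transient failures with backoff and then falling back to an unaffected model. A minimal sketch of that pattern (the retry helper is generic; the commented wiring against Groq's OpenAI-compatible endpoint, and the fallback model choice, are illustrative assumptions, not vendor guidance):

```python
import time

def call_with_fallback(call, models, max_retries=3, base_delay=1.0, sleep=time.sleep):
    """Try each model in order; retry transient failures with exponential backoff.

    `call(model)` should invoke the completion API for `model` and raise on
    retryable errors (e.g. 429/5xx). Returns the first successful response;
    re-raises the last error if every model and attempt fails.
    """
    last_err = None
    for model in models:
        for attempt in range(max_retries):
            try:
                return call(model)
            except Exception as err:  # in real code, catch only retryable error types
                last_err = err
                sleep(base_delay * 2 ** attempt)  # backoff: 1s, 2s, 4s, ...
    raise last_err

# Hypothetical wiring against Groq's OpenAI-compatible endpoint:
# client = openai.OpenAI(base_url="https://api.groq.com/openai/v1", api_key=...)
# call_with_fallback(
#     lambda m: client.chat.completions.create(model=m, messages=msgs),
#     ["meta-llama/llama-4-scout-17b-16e-instruct", "llama-3.1-8b-instant"],
# )
```

Keeping the fallback list short and ordered by preference means healthy periods always use the primary model, while degraded periods degrade gracefully instead of surfacing raw errors.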