Groq Cloud incident
meta-llama/llama-4-scout-17b-16e-instruct Degraded Performance
Groq Cloud experienced a minor incident affecting meta-llama/llama-4-scout-17b-16e-instruct, beginning January 26, 2026 at 11:17 PM UTC and lasting roughly 1 hour 54 minutes. The incident has been resolved; the full update timeline is below.
Affected components: meta-llama/llama-4-scout-17b-16e-instruct
Update timeline
- investigating Jan 26, 2026, 11:17 PM UTC
Status: Investigating We are currently investigating an issue with meta-llama/llama-4-scout-17b-16e-instruct. Users may be experiencing elevated error rates or slow response times. Other models remain operational. Our team is analyzing logs and infrastructure to identify the cause. Affected components meta-llama/llama-4-scout-17b-16e-instruct (Degraded performance)
- identified Jan 27, 2026, 12:06 AM UTC
Status: Identified We have implemented a fix for meta-llama/llama-4-scout-17b-16e-instruct; however, performance is still degraded. We're continuing to investigate the model to determine the cause of the issue. Affected components meta-llama/llama-4-scout-17b-16e-instruct (Degraded performance)
- monitoring Jan 27, 2026, 12:29 AM UTC
Status: Monitoring We have implemented a new fix for meta-llama/llama-4-scout-17b-16e-instruct & groq/compound and performance has improved. We're continuing to monitor the models to ensure stability persists. If all remains normal, we will resolve the incident in the next update. Affected components meta-llama/llama-4-scout-17b-16e-instruct (Operational)
- resolved Jan 27, 2026, 01:11 AM UTC
Status: Resolved This incident has been resolved. Thank you for your patience. Affected components meta-llama/llama-4-scout-17b-16e-instruct (Operational)
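During windows of degraded performance like this one, clients typically see intermittent errors and slow responses rather than a hard outage. A common mitigation is to wrap API calls in a retry loop with exponential backoff and jitter. The sketch below is illustrative and not part of any Groq SDK; `call_with_backoff` and its parameters are hypothetical names, and the retried callable could be any request function (e.g. a chat-completions call against meta-llama/llama-4-scout-17b-16e-instruct).

```python
import random
import time


def call_with_backoff(fn, max_attempts=5, base_delay=0.5, max_delay=8.0, sleep=time.sleep):
    """Retry fn() with exponential backoff plus jitter.

    fn is any zero-argument callable that raises on transient failure
    (e.g. an HTTP 503 during an incident). Re-raises the final
    exception if every attempt fails.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the error to the caller
            # Double the delay each attempt, capped at max_delay.
            delay = min(max_delay, base_delay * (2 ** attempt))
            # Random jitter spreads out retries from many clients.
            sleep(delay + random.uniform(0, delay / 2))
```

The injectable `sleep` parameter keeps the helper testable without real waiting; in production the default `time.sleep` applies.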