Fix Laravel Job Queue not processing with MaxAttemptsExceededException

Fix issues with the Laravel Job Queue not processing jobs due to max attempts exceeding.

12 Jan 2025
|
2 min read

At Streamfinity, we recently ran into an issue where the Job Queue would stop processing dispatched jobs and just fail them with a MaxAttemptsExceededException.

Queue Monitor Logs

The Situation

With our analytics and stream tracking at Streamfinity, we process hundreds of thousands of data points each day. Each analytics assembly job aggregates roughly 50,000 data points.

As you can imagine, these are long-running jobs with non-deterministic execution times.

We therefore need to make sure that only a single job processes the data at a time. This can be achieved with the WithoutOverlapping middleware. The middleware acquires a lock (right before the job starts processing) using Laravel's atomic locks; while the lock is held, every subsequent attempt to process the same job class is blocked, and once the retry limit is reached the job fails with a MaxAttemptsExceededException.

use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\Middleware\WithoutOverlapping;

final class SampleJob implements ShouldQueue
{
    public function middleware(): array
    {
        return [
            new WithoutOverlapping(),
        ];
    }
}

Because our jobs are long-running, errors can occur and jobs will fail. In case of a failure or timeout, the lock is never released, and every subsequent dispatch of the job will fail as well.
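If you end up in this state, one way out is to release the stuck lock by hand via Laravel's atomic lock API. A minimal sketch, run from tinker or a one-off command; the key below is illustrative (WithoutOverlapping prefixes its locks with "laravel-queue-overlap:" followed by the job's overlap key, which defaults to the job class name):

```php
use Illuminate\Support\Facades\Cache;

// Force-release a lock we no longer own. Only do this once you are sure
// the job holding it has actually died; the key name here is an example.
Cache::lock('laravel-queue-overlap:App\Jobs\SampleJob')->forceRelease();
```
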

Medis Redis GUI, no expiration on job lock

Solution

This is why you should always define an expiration time, using the expireAfter() method, after which you can safely assume the job has silently failed and the lock can be freed.

use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\Middleware\WithoutOverlapping;

final class SampleJob implements ShouldQueue
{
    public function middleware(): array
    {
        return [
            (new WithoutOverlapping())
                ->dontRelease()
                ->expireAfter(60 * 5), // lock expiration in seconds
        ];
    }
}
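Note that dontRelease() deletes an overlapping job immediately instead of retrying it. If you would rather have overlapping jobs wait and retry, the middleware also offers releaseAfter(); a sketch with illustrative values:

```php
// Alternative: put overlapping jobs back on the queue and retry later,
// instead of dropping them. Both durations below are examples.
(new WithoutOverlapping())
    ->releaseAfter(60)     // retry the overlapping job after 60 seconds
    ->expireAfter(60 * 5); // still free the lock if the holder dies silently
```

With this variant, make sure the job's tries or retryUntil budget is large enough to survive the expected waiting time, or you will hit the same MaxAttemptsExceededException by a different route.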
