Unexpected Behavior of ShouldBeUniqueUntilProcessing and WithoutOverlap in Jobs #48013

Closed
buffalom opened this issue Aug 9, 2023 · 6 comments

Comments

@buffalom

buffalom commented Aug 9, 2023

Laravel Version

10.18.0

PHP Version

8.2.3

Database Driver & Version

No response

Description

A job that is released back onto the queue (e.g. by the WithoutOverlapping middleware) does not re-acquire the unique lock and therefore does not prevent new jobs implementing the ShouldBeUnique (or ShouldBeUniqueUntilProcessing) interface from being added to the queue.
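
As far as we can tell from the unique-jobs documentation, the lifecycle looks roughly like the sketch below. This is our own simplified reconstruction, not framework source, assuming Illuminate\Bus\UniqueLock is the helper that backs the ShouldBeUnique* interfaces:

```php
use App\Jobs\TestJob; // our job, see "Steps To Reproduce" below
use Illuminate\Bus\UniqueLock;
use Illuminate\Support\Facades\Cache;

$job  = new TestJob('02');
$lock = new UniqueLock(Cache::store());

// 1. Dispatch time: the job is only pushed when the unique lock can be acquired.
if ($lock->acquire($job)) {
    // ... the job is pushed onto the queue
}

// 2. Processing starts: for ShouldBeUniqueUntilProcessing the lock is dropped
//    before handle() runs, so new dispatches of the same job are allowed again.
$lock->release($job);

// 3. WithoutOverlapping then releases the job back onto the queue (because
//    another TestJob is already running), but nothing re-acquires the unique
//    lock, so a later dispatch of the "same" unique job is queued instead of
//    being discarded.
```

In other words, once a worker has popped TestJob 2 the unique lock is gone, and the release performed by WithoutOverlapping does not bring it back, so dispatching TestJob 3 succeeds.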

Context

We have a job that runs for ~10 seconds, walks through the content of our app, and updates something in a cache. It is triggered by various event listeners, so it does not run at a regular interval.
Because the job updates the cache for all content, there only ever needs to be one job in the queue to keep the cache up to date. This is why we made the job implement the ShouldBeUnique interface. However, that can leave the cache outdated, because no new job is pushed to the queue while a job is already being processed.
That's why we wanted to switch the job to the ShouldBeUniqueUntilProcessing interface combined with the WithoutOverlapping middleware. That way, a new job would be added to the queue while one is being processed, but no two jobs would ever be processed simultaneously (which could lead to issues); instead, the second one would be released back onto the queue.
It seems that ShouldBeUniqueUntilProcessing and WithoutOverlapping do not work together the way we expect them to.

Details

TestJob takes about 10 seconds to process and we have two workers running (using Horizon).

Expected flow
  1. TestJob 1 is dispatched and added to the queue
  2. TestJob 1 is being processed by worker 1
  3. TestJob 2 is dispatched and added to the queue
  4. TestJob 2 is popped by worker 2 and released back to the queue
  5. TestJob 3 is dispatched and should be discarded, because the released TestJob 2 is already in the queue
  6. TestJob 2 is popped by a worker and processed (when TestJob 1 has finished processing)
Actual flow
  1. TestJob 1 is dispatched and added to the queue
  2. TestJob 1 is being processed by worker 1
  3. TestJob 2 is dispatched and added to the queue
  4. TestJob 2 is popped by worker 2 and released back to the queue
  5. TestJob 3 is dispatched and added to the queue
  6. TestJob 3 is being processed by worker 2
  7. TestJob 2 is popped by worker 1 and processed (when TestJob 1 has finished processing)
Question

Is the actual behaviour intended (and our expectation simply wrong), or is this a genuine issue?

Steps To Reproduce

We have Horizon running with two worker processes.

Our `TestJob`
```php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldBeUniqueUntilProcessing;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\Middleware\WithoutOverlapping;
use Illuminate\Queue\SerializesModels;

class TestJob implements ShouldQueue, ShouldBeUniqueUntilProcessing
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // 0 = unlimited attempts, so repeated releases never fail the job.
    public $tries = 0;

    public function __construct(
        public string $jobId
    ) {
    }

    public function middleware(): array
    {
        // If another TestJob is running, release this one with an 8 second delay.
        return [(new WithoutOverlapping())->releaseAfter(8)];
    }

    /**
     * Execute the job.
     */
    public function handle(): void
    {
        sleep(10); // simulates ~10 seconds of cache rebuilding
    }

    public function getJobId(): string
    {
        return $this->jobId;
    }

    // Horizon tags, used by the debug listeners below.
    public function tags()
    {
        return [$this->getJobId(), "test-job:{$this->getJobId()}"];
    }
}
```
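
For clarity: TestJob deliberately defines neither uniqueId() nor a $uniqueId property, so (as we understand the unique-job docs) uniqueness is keyed on the class alone and at most one TestJob should ever be waiting in the queue, regardless of $jobId. A per-ID variant would instead add something like this (hypothetical, not what we want here):

```php
// Hypothetical: only needed if we wanted one queued job *per* $jobId
// rather than a single TestJob overall.
public function uniqueId(): string
{
    return $this->jobId;
}
```
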
Additional event listeners to debug job processing
```php
// Registered while debugging, e.g. in a service provider's boot() method.
use Illuminate\Queue\Events\JobFailed;
use Illuminate\Queue\Events\JobPopped;
use Illuminate\Queue\Events\JobProcessed;
use Illuminate\Queue\Events\JobProcessing;
use Illuminate\Queue\Events\JobQueued;
use Illuminate\Queue\Events\JobReleasedAfterException;
use Illuminate\Queue\Events\JobRetryRequested;
use Illuminate\Queue\Events\JobTimedOut;
use Illuminate\Support\Facades\Event;

Event::listen(JobProcessing::class, function (JobProcessing $event) {
    $jobName = class_basename($event->job->resolveName());
    logger()->debug("{$jobName} {$event->job->payload()['tags'][0]} processing");
});

Event::listen(JobFailed::class, function (JobFailed $event) {
    $jobName = class_basename($event->job->resolveName());
    logger()->debug("{$jobName} {$event->job->payload()['tags'][0]} failed");
});

Event::listen(JobPopped::class, function (JobPopped $event) {
    $jobName = class_basename($event->job->resolveName());
    logger()->debug("{$jobName} {$event->job->payload()['tags'][0]} popped from queue");
});

Event::listen(JobProcessed::class, function (JobProcessed $event) {
    $jobName = class_basename($event->job->resolveName());
    logger()->debug("{$jobName} {$event->job->payload()['tags'][0]} processed");
});

Event::listen(JobQueued::class, function (JobQueued $event) {
    $jobName = class_basename($event->job);
    logger()->debug("{$jobName} {$event->job->getJobId()} queued");
});

Event::listen(JobReleasedAfterException::class, function (JobReleasedAfterException $event) {
    $jobName = class_basename($event->job->resolveName());
    logger()->debug("{$jobName} {$event->job->payload()['tags'][0]} released after exception");
});

Event::listen(JobRetryRequested::class, function (JobRetryRequested $event) {
    $jobName = class_basename($event->job);
    logger()->debug("{$jobName} {$event->payload()['tags'][0]} requested a retry");
});

Event::listen(JobTimedOut::class, function (JobTimedOut $event) {
    $jobName = class_basename($event->job->resolveName());
    logger()->debug("{$jobName} {$event->job->payload()['tags'][0]} timed out");
});
```

Test

```php
// e.g. routes/web.php
use App\Jobs\TestJob; // adjust to wherever TestJob actually lives
use Illuminate\Support\Facades\Route;

Route::get('/issue', function () {
    // should be started after 2 seconds
    dispatch((new TestJob('01'))->delay(now()->addSeconds(2)));

    sleep(6);

    // should be started after job 1 is completed
    dispatch(new TestJob('02'));

    sleep(8);

    // should not be started because job 2 is still in the queue and not processing
    dispatch(new TestJob('03'));

    return 'Jobs dispatched '.now()->toDateTimeString();
});
```

Issue: job 2 is released back onto the queue, but job 3 is still added to the queue, even though job 2 is already waiting in the queue again (to be retried after 8 seconds).
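
The behaviour we expected could probably be approximated with a custom middleware that re-acquires the unique lock before releasing the job. The sketch below is untested and entirely our own idea: the class name WithoutOverlappingUntilProcessing and the 30-second overlap lock are made up, and we are assuming Illuminate\Bus\UniqueLock::acquire() can be called manually.

```php
use Illuminate\Bus\UniqueLock;
use Illuminate\Support\Facades\Cache;

// Hypothetical replacement for WithoutOverlapping (name and lock timeout are ours).
class WithoutOverlappingUntilProcessing
{
    public function handle($job, $next)
    {
        // One overlap lock per job class, mirroring what we expect from WithoutOverlapping.
        $lock = Cache::lock('test-job-overlap:'.get_class($job), 30);

        if (! $lock->get()) {
            // Another TestJob is running: put the unique lock back in place so
            // further dispatches are discarded while this instance waits, then
            // release it with the same 8 second delay as before.
            (new UniqueLock(Cache::store()))->acquire($job);

            $job->release(8);

            return;
        }

        try {
            $next($job);
        } finally {
            $lock->release();
        }
    }
}
```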

Log

```
[2023-08-09 12:12:43] local.DEBUG: TestJob 01 queued
[2023-08-09 12:12:45] local.DEBUG: TestJob 02 processed
[2023-08-09 12:12:45] local.DEBUG: TestJob 01 popped from queue
[2023-08-09 12:12:45] local.DEBUG: TestJob 01 processing
[2023-08-09 12:12:49] local.DEBUG: TestJob 02 queued
[2023-08-09 12:12:50] local.DEBUG: TestJob 02 popped from queue
[2023-08-09 12:12:50] local.DEBUG: TestJob 02 processing
[2023-08-09 12:12:50] local.DEBUG: TestJob 02 processed
[2023-08-09 12:12:55] local.DEBUG: TestJob 01 processed
[2023-08-09 12:12:57] local.DEBUG: TestJob 03 queued
[2023-08-09 12:12:58] local.DEBUG: TestJob 03 popped from queue
[2023-08-09 12:12:58] local.DEBUG: TestJob 03 processing
[2023-08-09 12:12:59] local.DEBUG: TestJob 02 popped from queue
[2023-08-09 12:12:59] local.DEBUG: TestJob 02 processing
[2023-08-09 12:12:59] local.DEBUG: TestJob 02 processed
[2023-08-09 12:13:08] local.DEBUG: TestJob 03 processed
[2023-08-09 12:13:08] local.DEBUG: TestJob 02 popped from queue
[2023-08-09 12:13:08] local.DEBUG: TestJob 02 processing
[2023-08-09 12:13:18] local.DEBUG: TestJob 02 processed
```
@crynobone
Member

I believe this may be a niche use case: ShouldBeUniqueUntilProcessing and WithoutOverlapping weren't designed to complement each other, but this may be an improvement that can be added.

@github-actions

Thank you for reporting this issue!

As Laravel is an open source project, we rely on the community to help us diagnose and fix issues as it is not possible to research and fix every issue reported to us via GitHub.

If possible, please make a pull request fixing the issue you have described, along with corresponding tests. All pull requests are promptly reviewed by the Laravel team.

Thank you!

@santas156

Same here, I ran into this unexpected behavior in my project last week too.

@alfonsogarza

> I believe this may be a niche use case: ShouldBeUniqueUntilProcessing and WithoutOverlapping weren't designed to complement each other, but this may be an improvement that can be added.

Could you elaborate on your reasoning? I'm trying to understand whether we should reevaluate our need to use both.

From our perspective, using WithoutOverlapping increases the need for (or usefulness of) ShouldBeUniqueUntilProcessing, because jobs are more likely to be queued.

@driesvints
Member

Closing this issue because it's inactive, already solved, old, or no longer relevant. Feel free to open up a new issue if you're still experiencing this.

@alfonsogarza

@crynobone when you have time could you reply to my comment above please?
