Job Queue implementation using Google Cloud Tasks to push jobs to your worker instance
```shell
npm install @pinelab/vendure-plugin-google-cloud-tasks
```
Plugin for using the Vendure worker with Google Cloud Tasks. This plugin will show pending, successful and failed jobs in the admin UI under `system/jobs`, but not running jobs. Only jobs of the past 7 days are kept in the DB.
1. Remove the `DefaultJobQueuePlugin` from your vendure-config. Add this plugin to your `vendure-config.ts` instead, as shown in the sketch below this list.
2. Run a database migration to create the `JobRecordBuffer` table.
3. Go to `Products > (cog icon) > reindex` to test the Cloud Tasks plugin.
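Below is a minimal sketch of the plugin registration. `CloudTasksPlugin` and the option names (`taskHandlerHost`, `projectId`, `location`, `authSecret`) are assumptions for illustration and may differ in your version, so check the plugin's own documentation for the exact init options.

```ts
import { VendureConfig } from '@vendure/core';
import { CloudTasksPlugin } from '@pinelab/vendure-plugin-google-cloud-tasks';

export const config: VendureConfig = {
  // ...the rest of your existing Vendure config
  plugins: [
    // Replaces the DefaultJobQueuePlugin. Option names are illustrative assumptions.
    CloudTasksPlugin.init({
      taskHandlerHost: 'https://my-vendure-server.io', // public host that Cloud Tasks pushes jobs to
      projectId: 'my-gcp-project',
      location: 'europe-west1',
      authSecret: 'some-secret', // secret used to authenticate incoming task requests
    }),
  ],
};
```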
This plugin installs the `SQLJobBufferStrategy` from Vendure's default JobQueue plugin to buffer jobs in the database. This is because most projects that use Google Cloud Tasks also run multiple instances of the Vendure server.
You can call the endpoint `/cloud-tasks/clear-jobs/X` with the secret as Auth header to clear jobs older than X days. For example, calling `/cloud-tasks/clear-jobs/1` will clear all jobs older than 1 day.
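A minimal sketch of such a call is shown below; the hostname and secret are placeholders, and the exact format of the Authorization header is an assumption.

```ts
// Clear all jobs older than 1 day.
// Placeholder host and secret; the Authorization header format is an assumption.
const response = await fetch('https://my-vendure-server.io/cloud-tasks/clear-jobs/1', {
  headers: { Authorization: 'some-secret' },
});
console.log(response.status); // expect a 2xx response when the jobs have been cleared
```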
When pushing multiple tasks concurrently to a queue in serverless environments, you might see `DEADLINE_EXCEEDED` errors. If that happens, you can instantiate the plugin with `fallback: true` to make the Google Cloud Tasks client fall back to HTTP instead of gRPC. For more details see https://github.com/googleapis/nodejs-tasks/issues/397#issuecomment-618580649
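As a sketch, the option would sit alongside your other init options (only `fallback: true` itself is confirmed by the text above; the surrounding shape follows the registration sketch earlier):

```ts
// In the CloudTasksPlugin.init(...) call in your vendure-config.ts:
CloudTasksPlugin.init({
  // ...your other options
  fallback: true, // use HTTP instead of gRPC to avoid DEADLINE_EXCEEDED errors
});
```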
If you see a `PayloadTooLargeError: request entity too large`, the job data is larger than NestJS's configured request limit. You can set a larger limit in your vendure-config.ts:
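Below is a sketch of raising the limit via Vendure's `apiOptions.middleware`, using body-parser's `json` handler; the `5mb` value is only an example, and the `beforeListen` flag assumes a Vendure version that supports it.

```ts
import { json } from 'body-parser';
import { VendureConfig } from '@vendure/core';

export const config: VendureConfig = {
  // ...the rest of your existing Vendure config
  apiOptions: {
    middleware: [
      {
        // Raise the JSON body size limit so that large job payloads are accepted.
        handler: json({ limit: '5mb' }),
        route: '*',
        beforeListen: true,
      },
    ],
  },
};
```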
We don't include this in the plugin, because it affects the entire NestJS instance.
ER_OUT_OF_SORTMEMORY: Out of sort memory, consider increasing server sort buffer size on MySQL

If you get this error, you should create an index on the `createdAt` column of the `job_record` table:
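One way to add that index is through a TypeORM migration, sketched below; the migration class name and index name are arbitrary examples, and you could just as well run the `CREATE INDEX` statement directly against MySQL.

```ts
import { MigrationInterface, QueryRunner } from 'typeorm';

// Adds an index on job_record.createdAt to avoid out-of-sort-memory errors on MySQL.
// The migration class name and index name are arbitrary examples.
export class AddJobRecordCreatedAtIndex1700000000000 implements MigrationInterface {
  public async up(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(
      'CREATE INDEX `IDX_job_record_createdAt` ON `job_record` (`createdAt`)'
    );
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query('DROP INDEX `IDX_job_record_createdAt` ON `job_record`');
  }
}
```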
The error is caused by the fact that the `job_record.data` column is a `json` column and can contain a lot of data. More information can be found here: https://stackoverflow.com/questions/29575835/error-1038-out-of-sort-memory-consider-increasing-sort-buffer-size