Google Cloud Tasks

Job Queue implementation using Google Cloud Tasks to push jobs to your worker instance

npm install @pinelab/vendure-plugin-google-cloud-tasks
Latest version: 2.0.1
Compatibility: >=2.2.0
Last published: Aug 14, 2025
Pinelab: webshops for mission-driven brands and wholesalers

Official documentation here

Plugin for using the Vendure worker with Google Cloud Tasks. This plugin shows pending, successful, and failed jobs in the Admin UI under system/jobs, but not currently running jobs. Only jobs from the past 7 days are kept in the database.

Getting started

Plugin setup

  1. Remove DefaultJobQueuePlugin from your vendure-config.ts and add this plugin instead.
  2. Run a database migration to add the JobRecordBuffer table.
  3. Start the Vendure server, log in to the Admin UI, and trigger a reindex job via Products > (cog icon) > reindex to test the Cloud Tasks plugin.
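As a sketch, the plugin registration in vendure-config.ts could look like the following. The option names shown (taskHandlerHost, projectId, location, authSecret, queueSuffix) are assumptions based on typical Cloud Tasks setups; check the official documentation for the exact options:

```typescript
import { VendureConfig } from '@vendure/core';
import { CloudTasksPlugin } from '@pinelab/vendure-plugin-google-cloud-tasks';

export const config: VendureConfig = {
  // ... your existing config, with DefaultJobQueuePlugin removed
  plugins: [
    CloudTasksPlugin.init({
      // Public URL of the worker instance that Cloud Tasks pushes jobs to
      taskHandlerHost: 'https://your-worker-host.example.com',
      projectId: 'your-gcp-project-id',
      location: 'europe-west1',
      // Shared secret used to authorize incoming task requests
      authSecret: 'some-secret',
      // Suffix appended to queue names, e.g. to separate environments
      queueSuffix: 'production',
    }),
  ],
};
```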

This plugin installs the SQLJobBufferStrategy from Vendure's default JobQueue plugin to buffer jobs in the database, because most projects that use Google Cloud Tasks also run multiple instances of the Vendure server.

You can call the endpoint /cloud-tasks/clear-jobs/X, with the plugin's configured secret as the Authorization header, to clear jobs older than X days. For example, calling /cloud-tasks/clear-jobs/1 will clear all jobs older than 1 day.
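As an illustrative sketch, such a call could be made like this. The buildClearJobsRequest helper is hypothetical (not part of the plugin), and the assumption here is that the secret is sent verbatim as the Authorization header:

```typescript
// Hypothetical helper: builds the URL and headers for the clear-jobs endpoint.
function buildClearJobsRequest(
  host: string,
  secret: string,
  olderThanDays: number
): { url: string; headers: Record<string, string> } {
  return {
    url: `${host}/cloud-tasks/clear-jobs/${olderThanDays}`,
    // The plugin expects the configured secret as the Auth header
    headers: { Authorization: secret },
  };
}

// Clear all jobs older than 1 day
const req = buildClearJobsRequest('https://my-vendure-server.io', 'some-secret', 1);
// fetch(req.url, { headers: req.headers });
```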

DEADLINE_EXCEEDED errors when pushing tasks to queue

When pushing multiple tasks concurrently to a queue in serverless environments, you might see DEADLINE_EXCEEDED errors. If that happens, you can instantiate the plugin with fallback: true to make the Google Cloud Tasks client fall back to HTTP instead of gRPC. For more details, see https://github.com/googleapis/nodejs-tasks/issues/397#issuecomment-618580649
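For example, a sketch showing only the relevant option alongside whatever other options your setup already uses:

```typescript
import { CloudTasksPlugin } from '@pinelab/vendure-plugin-google-cloud-tasks';

CloudTasksPlugin.init({
  // ... your other Cloud Tasks options
  // Use HTTP instead of gRPC to avoid DEADLINE_EXCEEDED in serverless environments
  fallback: true,
});
```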

Request entity too large

This means the job data is larger than NestJS's configured request body limit. You can set a larger limit in your vendure-config.ts:
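A sketch of raising the limit, assuming Vendure's apiOptions.middleware combined with Express's JSON body parser; the 10mb value is illustrative, and you should verify the middleware shape against the Vendure docs for your version:

```typescript
import { json } from 'body-parser';
import { VendureConfig } from '@vendure/core';

export const config: VendureConfig = {
  // ... your existing config
  apiOptions: {
    middleware: [
      {
        // Apply a larger JSON body size limit to all routes
        route: '/',
        handler: json({ limit: '10mb' }),
      },
    ],
  },
};
```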

We don't include this in the plugin, because it affects the entire NestJS instance.

ER_OUT_OF_SORTMEMORY: Out of sort memory, consider increasing server sort buffer size on MySQL

If you get this error, you should create an index on the createdAt column of the job table:
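A sketch of adding that index as a TypeORM migration, assuming the underlying table is named job_record (as Vendure's default entity naming would suggest); verify the exact table and column names in your database before running it:

```typescript
import { MigrationInterface, QueryRunner } from 'typeorm';

export class AddJobRecordCreatedAtIndex implements MigrationInterface {
  public async up(queryRunner: QueryRunner): Promise<void> {
    // Index createdAt so MySQL can sort without exhausting the sort buffer
    await queryRunner.query(
      'CREATE INDEX `IDX_job_record_created_at` ON `job_record` (`createdAt`)'
    );
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(
      'DROP INDEX `IDX_job_record_created_at` ON `job_record`'
    );
  }
}
```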

The error is caused by the job_record.data column being a json column that can contain a lot of data. More information can be found here: https://stackoverflow.com/questions/29575835/error-1038-out-of-sort-memory-consider-increasing-sort-buffer-size