Background job processing: running jobs, configuring the queue, and creating custom jobs.
The MediaWiki job queue is a deferred task system that moves expensive, non-urgent operations out of the web request cycle. When a user saves a page, for example, refreshing the HTML cache of every page that transcludes it would be too slow to do inline, so MediaWiki enqueues a job and processes it asynchronously.
Job
A unit of deferred work. Each job has a type, a target title, and a parameters array. Jobs subclass MediaWiki\JobQueue\Job and implement a run() method.
JobQueue
A per-type queue backed by a storage medium (database or Redis). Each job type has its own queue, configured independently.
JobQueueGroup
Aggregates all per-type queues for a wiki. Used to enqueue jobs without knowing which backend a specific type uses.
JobRunner
The execution engine, invoked by runJobs.php or triggered inline via $wgJobRunRate. Dequeues and runs jobs, handling retries on failure.
Each job queue backend guarantees at-least-once execution — a job will run at least once, but may run more than once if the runner crashes before acknowledging completion. Job implementations should therefore be idempotent.
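One way to achieve idempotency is to re-read the current state in run() instead of trusting the snapshot of data captured at enqueue time. The following is a sketch of that pattern for a hypothetical job keyed by a `pageId` parameter (the job type and parameter name are illustrative, not part of core):

```php
public function run() {
	$services = \MediaWiki\MediaWikiServices::getInstance();

	// Re-check current state rather than trusting what was true at
	// enqueue time, so running the same job twice is a harmless no-op.
	$page = $services->getWikiPageFactory()->newFromID( $this->params['pageId'] );
	if ( !$page ) {
		// The page was deleted after the job was enqueued; nothing to do.
		// Returning true acknowledges the job so it is not retried.
		return true;
	}

	// ... recompute and write derived data from the page's current state ...
	return true;
}
```

Returning false instead signals failure, leaving the job eligible for retry by a later runner.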
The primary way to process jobs is via the runJobs.php maintenance script:
```shell
# Run all pending jobs until the queue is empty
php maintenance/run.php runJobs

# Run at most 500 jobs
php maintenance/run.php runJobs --maxjobs 500

# Run for at most 60 seconds of wall-clock time
php maintenance/run.php runJobs --maxtime 60

# Run only jobs of a specific type
php maintenance/run.php runJobs --type htmlCacheUpdate

# Use 4 parallel worker processes
php maintenance/run.php runJobs --procs 4

# Ignore throttling limits
php maintenance/run.php runJobs --nothrottle

# Output a JSON summary instead of human-readable text
php maintenance/run.php runJobs --result json

# Stay running and wait for new jobs instead of exiting
php maintenance/run.php runJobs --wait
```
By default, MediaWiki also runs a small number of jobs inline at the end of web requests, controlled by $wgJobRunRate. Set it to 0 to disable inline execution and rely entirely on runJobs.php.
```php
// LocalSettings.php

// Run 1 job per web request on average (default)
$wgJobRunRate = 1;

// Run no jobs inline (use a cron job instead)
$wgJobRunRate = 0;

// Run 2 jobs per 10 requests on average
$wgJobRunRate = 0.2;
```
For production wikis under significant load, set $wgJobRunRate = 0 and drive runJobs.php from a cron job or a dedicated worker process. This prevents slow jobs from affecting page response times.
```shell
# Run jobs every minute, for up to 60 seconds per invocation
* * * * * www-data php /var/www/wiki/maintenance/run.php runJobs --maxtime 60 >> /var/log/mediawiki-jobs.log 2>&1
```
The showJobs maintenance script displays the number of pending jobs, optionally broken down by type.
```shell
# Show total count of pending jobs
php maintenance/run.php showJobs

# Show per-type breakdown
php maintenance/run.php showJobs --group

# List individual jobs of a specific type
php maintenance/run.php showJobs --list --type htmlCacheUpdate

# Filter by job state: unclaimed, delayed, claimed, abandoned
php maintenance/run.php showJobs --list --status unclaimed

# Limit the number of jobs listed
php maintenance/run.php showJobs --list --limit 50
```
By default, jobs are stored in the job table in the wiki’s primary database (JobQueueDB). For high-traffic wikis, Redis provides better performance and monitoring options.
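A minimal sketch of pointing the default backend at Redis via $wgJobTypeConf, assuming the php-redis extension is installed and a Redis server is reachable; the server address and password are illustrative values:

```php
// LocalSettings.php
$wgJobTypeConf['default'] = [
	'class' => 'JobQueueRedis',
	'redisServer' => '127.0.0.1:6379',
	'redisConfig' => [ 'password' => null ],
	// JobQueueRedis expects an external runner service (such as the
	// mediawiki/services/jobrunner daemon) to consume the queue.
	'daemonized' => true,
];
```

Per-type entries in $wgJobTypeConf override the 'default' entry, so individual job types can be kept on the database backend while others move to Redis.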
To enqueue a job, obtain the JobQueueGroup service and push a JobSpecification:

```php
use MediaWiki\JobQueue\JobSpecification;
use MediaWiki\MediaWikiServices;

$services = MediaWikiServices::getInstance();
$jobQueueGroup = $services->getJobQueueGroup();

$job = new JobSpecification(
	'myCustomJob',
	[ 'pageId' => $pageId ],
	[],
	$title
);

// lazyPush defers enqueueing until after the response is sent
$jobQueueGroup->lazyPush( $job );
```
Prefer lazyPush() over push() in web request contexts. lazyPush() defers the enqueue until after the response has been flushed to the client, keeping request latency low. Use push() only when you need to surface enqueue failures to the caller.
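When enqueue failures do need handling, push() can be wrapped in a try/catch. This is a sketch; the namespaced exception class shown is an assumption that holds for recent MediaWiki versions, while older versions use the global JobQueueError class:

```php
use MediaWiki\JobQueue\Exceptions\JobQueueError;

try {
	// push() enqueues synchronously, so backend failures surface here
	$jobQueueGroup->push( $job );
} catch ( JobQueueError $e ) {
	// Log and degrade gracefully rather than failing the whole request
	wfLogWarning( 'Failed to enqueue myCustomJob: ' . $e->getMessage() );
}
```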
To prevent duplicate jobs from piling up, override getDeduplicationInfo() and set $this->removeDuplicates = true:
```php
public function __construct( Title $title, array $params ) {
	parent::__construct( 'myCustomJob', $title, $params );
	// Remove duplicates based on deduplication info
	$this->removeDuplicates = true;
}

public function getDeduplicationInfo(): array {
	$info = parent::getDeduplicationInfo();
	// Only keep the page ID in deduplication info
	$info['params'] = [ 'pageId' => $this->params['pageId'] ];
	return $info;
}
```
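For the runner to be able to instantiate the job, the job type must also be registered in $wgJobClasses, mapping the type name to its class (in an extension, the equivalent is the JobClasses attribute of extension.json). The class name here is illustrative:

```php
// LocalSettings.php
$wgJobClasses['myCustomJob'] = MyCustomJob::class;
```

With this mapping in place, runJobs.php can dequeue 'myCustomJob' entries and dispatch them to the class's run() method.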