Abstract Wikipedia/Abstract developer cheatsheet


This is a Wikifunctions development cheatsheet and collection of helpful links.

Wikifunctions Repositories

The Wikifunctions stack consists of four repositories, currently spread across Gerrit and GitLab (they will be fully on GitLab once the rest of the Foundation, and with it the entire MediaWiki stack, migrates there):


WikiLambda is the MediaWiki extension for Wikifunctions. It contains the data persistence layer, the front-end Vue interface, and the API interface for making requests to the function orchestrator backend. WikiLambda is currently hosted on Gerrit:


To clone this repository, follow the instructions in the Gerrit page:

git clone "ssh://<username>@gerrit.wikimedia.org:29418/mediawiki/extensions/WikiLambda"

This repository depends on the function-schemata submodule.

More documentation can be found on:

Function Orchestrator

Function orchestrator is a Node service that manages the execution of Function Calls (Z7s). It is the point of interoperation between MediaWiki/WikiLambda and the function evaluator, which executes native code in various programming languages. The function orchestrator repository is hosted on GitLab:


To clone this repository, follow the instructions in the README file:

git clone --recurse-submodules git@gitlab.wikimedia.org:repos/abstract-wiki/wikifunctions/function-orchestrator.git

This repository depends on the function-schemata submodule.

More documentation can be found on:

Function Evaluator

The evaluator service executes user-written 'native' code in a variety of programming languages. The repository consists of the evaluator service and a variety of language-specific executors. The function evaluator repository is hosted on GitLab:


To clone this repository, follow the instructions in the README file:

git clone --recurse-submodules git@gitlab.wikimedia.org:repos/abstract-wiki/wikifunctions/function-evaluator.git

This repository depends on the function-schemata submodule.

More documentation can be found on:

Function Schemata

This repository is a shared set of JSON schemata for the Wikifunctions project, to achieve a "single version of the truth" on what counts as a structurally valid ZObject. It is used as a git sub-module for the function-orchestrator and function-evaluator services, and the WikiLambda MediaWiki extension. The function schemata repository is hosted on GitLab:


To clone this repository, do:

git clone git@gitlab.wikimedia.org:repos/abstract-wiki/wikifunctions/function-schemata.git

To update the repository as submodules of each project, go to each project root directory and run:

git submodule update --init --recursive
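The per-project step above can be scripted. A minimal sketch, assuming hypothetical side-by-side checkout paths (adjust them to your own layout):

```shell
# Minimal sketch: refresh the function-schemata submodule in every checkout.
# The repository paths below are assumptions; point them at your own clones.
update_all_schemata() {
    for repo in "$HOME/WikiLambda" "$HOME/function-orchestrator" "$HOME/function-evaluator"; do
        git -C "$repo" submodule update --init --recursive
    done
}
```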

Gerrit Cheatsheet

These are the Gerrit and git-review summaries that I created for my own use and reference; you can find all of this information more extensively in:

Flow for creating and pushing a new patch:

$ git checkout -b mynewfeature
# make your changes and stage them
$ git commit
# Write message and description with Bug: <Phabricator task> at the bottom
$ git review
# Maybe do some more modifications on your patch
$ git commit --amend
$ git review

Git retains the commit history of the current branch when you create a new branch. If you want a "clean" branch, create it from origin/master:

$ git fetch  # to make sure you have the most up-to-date version of master
$ git checkout origin/master
$ git checkout -b my-new-branch

Flow for reviewing or amending a patch:

$ git review -d <gerrit patch ID> # you can find the ID in the Gerrit URL of the patch
# make your changes and stage them
$ git commit --amend
# modify the commit message or leave it the same
$ git review

If the automatic rebase doesn't work, use the same flow: do a normal rebase, then amend the commit and send it for review:

$ git checkout master
$ git pull origin master
$ git checkout <your branch>
$ git rebase master
$ git commit --amend
$ git review

Other git review stuff to remember:

$ git review -s --verbose  # Setup git review
$ git review -d 643568     # -d change (--download=change) downloads a change ID into a local branch.
$ git review
$ git review -R            # Git review but without performing rebase (--no-rebase)
$ git review -f            # Submit a change for review and close local branch (--finish)

Updating the MediaWiki installation

Follow environment installation instructions from https://gerrit.wikimedia.org/g/mediawiki/core/+/HEAD/DEVELOPERS.md

  • Go to mediawiki core and do git pull
  • This does not update the submodules directly, and you shouldn't update them if you don't want to lose your extension changes. You might need to update your skin version once in a while; for that, go to skins/Vector and run git pull
  • Remove directory cache/sqlite
  • Rename LocalSettings.php (the install script will generate a fresh one; keep yours as a backup)
  • Run docker compose up -d
  • Run docker compose exec mediawiki composer update
  • Run the installation script docker compose exec mediawiki /bin/bash /docker/install.sh
  • Copy your personal changes to the newly generated LocalSettings.php
  • Run docker compose exec mediawiki php maintenance/run.php update
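The steps above can be sketched as one script, to be run from your MediaWiki core checkout. The paths assume the standard docker compose dev environment, and the backup name LocalSettings.php.bak is a hypothetical choice:

```shell
# Sketch of the update steps; run from the MediaWiki core root directory.
update_mediawiki() {
    git pull                                    # update core (submodules are left alone)
    git -C skins/Vector pull                    # optionally refresh the Vector skin
    rm -rf cache/sqlite                         # remove the sqlite cache directory
    mv LocalSettings.php LocalSettings.php.bak  # hypothetical backup name
    docker compose up -d
    docker compose exec mediawiki composer update
    docker compose exec mediawiki /bin/bash /docker/install.sh
    # copy your personal changes from LocalSettings.php.bak into the new
    # LocalSettings.php, then run the update script:
    docker compose exec mediawiki php maintenance/run.php update
}
```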

Rights and Privileges

As you probably know, we have landed the first version of user rights and privileges. There are two user groups with different degrees of authority, so WikiLambda thinks twice before saving edits.

The WikiLambda user groups are:

  • functioneers: contributors to functions.
  • functionmaintainers: the function keepers; trusted users with a higher degree of authority.

To develop and test locally as we have done until now, we have to add our user to both groups. To give your Admin user the special rights for creating and editing ZObjects, run:

$ php maintenance/run.php createAndPromote --custom-groups functioneer,functionmaintainer --force Admin
$ # or for docker compose environment:
$ docker compose exec mediawiki php maintenance/run.php createAndPromote --custom-groups functioneer,functionmaintainer --force Admin

You can also edit your own user rights using the Special:UserRights page in your localhost installation: http://localhost:8080/wiki/Special:UserRights

Wikifunctions development workflow tips

WikiLambda starting guide


Vue Devtools

To work with our Vue (front-end) code, it's useful to employ the Vue Devtools. To help ensure smooth functioning of the Vue Devtools, add the following line to mediawiki/LocalSettings.php:

$wgVueDevelopmentMode = true;

If you run in "debug" mode (set $wgResourceLoaderDebug=true; in your LocalSettings.php file) each file will be shipped individually to your browser so you can see the lines more easily.

It can also be really helpful to check Disable cache, under the Devtools Network tab.

Set up mobile front-end in Dev environment

Follow the steps in the mediawiki docker mobile web setup document.

Unminified Front-End JS code

It can also be quite useful to debug using the unminified JavaScript source code on the front end. To do so, just add &debug=true to the URL and check your browser's JavaScript debugger again.

Git Submodules

git submodule update --init --recursive

Updating Submodules

Once there are new changes in function-schemata, it is convenient to keep the other projects updated to schemata's most recent version. To do so, there is a convenience script in all three projects (WikiLambda, function-orchestrator and function-evaluator). To pull function-schemata through, run the following commands from the up-to-date master branch of each of these repositories:

$ ./bin/updateSubmodule.sh
$ git review

This will generate a patch for each repo with a schemata pull-through to the latest version in master, with a commit summary detailing every update included in the new schemata version.

WikiLambda PHPUnit tests

About PHP unit testing: https://www.mediawiki.org/wiki/Manual:PHP_unit_testing/Running_the_tests

# mediawiki should be current directory for all these commands

# Run all test suite using local settings:
$ docker-compose exec mediawiki php tests/phpunit/phpunit.php

# Run WikiLambda tests:
$ docker-compose exec mediawiki php tests/phpunit/phpunit.php extensions/WikiLambda/tests/phpunit/

# Run tests and generate coverage in HTML
$ docker-compose exec mediawiki php tests/phpunit/phpunit.php --coverage-html="./docs/coverage" extensions/WikiLambda/tests/phpunit/ --whitelist="./extensions/WikiLambda"

# Run tests and generate coverage in txt (and print it)
$ docker-compose exec mediawiki php tests/phpunit/phpunit.php --coverage-text="./docs/coverage.txt" extensions/WikiLambda/tests/phpunit/ --whitelist="./extensions/WikiLambda" ; cat ./docs/coverage.txt

In the two coverage commands above, in some setups it may be necessary to insert -dxdebug.mode=coverage between php and tests/phpunit/phpunit.php.
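For example, the HTML coverage run with the xdebug flag inserted looks like this (a sketch recombining the commands above; your setup may not need the flag):

```shell
# HTML coverage run with xdebug's coverage mode enabled explicitly
docker-compose exec mediawiki php -dxdebug.mode=coverage tests/phpunit/phpunit.php \
    --coverage-html="./docs/coverage" extensions/WikiLambda/tests/phpunit/ \
    --whitelist="./extensions/WikiLambda"
```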

PHP linting

# Apply automatic fixes
docker-compose exec mediawiki composer fix extensions/WikiLambda/

# Check other PHP CodeSniffer errors (fix manually)
docker-compose exec mediawiki composer phpcs extensions/WikiLambda/

WikiLambda front-end tests

# With mediawiki/extensions/WikiLambda as current directory
# Install npm dependencies
npm ci 
# Run tests
npm test

Function-schemata tests

# Install npm dependencies
npm ci

# Run tests
npm test

To run the tests without the linting step:

npm run test:nolint

If you want to run a specific test, you can do:

npm run test:nolint -- -f <substring matching test name>

# For example:
npm run test:nolint -- -f "canonical lists"

If you want to run a specific test suite, you can do:

npm run test:unit tests/jest/integration/EditFunction.test.js

npm run test:unit tests/jest/store/modules/zobject.test.js

Other entry points and composer commands

Collected by James: https://phabricator.wikimedia.org/T285275

Debugging and logs

Logs are written to the default directory ./cache (mw-error.log, mw-dberror.log, etc.). You can print to these logs using the default groups (error, exception…) with wfDebugLog:

wfDebugLog("error", "Here's some error information about the variable $var" );

You can also create your custom log groups following https://www.mediawiki.org/wiki/Manual:How_to_debug#Creating_custom_log_groups

Debugging and logs in NodeJS services

Logging messages in function-orchestrator, function-evaluator and function-schemata can be done simply using console.log or console.error. To see the messages, you must rebuild the project with Blubber and re-initialize the Docker containers.

For example, after adding console.log statements in function-orchestrator or in its submodule function-schemata, run in the function-orchestrator root directory:

blubber .pipeline/blubber.yaml development | docker build -t local-orchestrator -f - .

Note (CM): I have the above command saved as a shell script since I use it dozens of times a day :).

And once built, restart your MediaWiki docker containers

docker compose up -d
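The rebuild-and-restart cycle can be wrapped in one hypothetical helper. MEDIAWIKI_DIR is an assumption here; point it at your MediaWiki core checkout:

```shell
# Hypothetical helper: rebuild the orchestrator image, then restart MediaWiki.
# Run from the function-orchestrator root; MEDIAWIKI_DIR is an assumed path.
rebuild_orchestrator() {
    MEDIAWIKI_DIR="${MEDIAWIKI_DIR:-$HOME/mediawiki}"
    blubber .pipeline/blubber.yaml development | docker build -t local-orchestrator -f - .
    # restart the MediaWiki containers so they pick up the new image
    ( cd "$MEDIAWIKI_DIR" && docker compose up -d )
}
```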

Use docker compose logs (or docker-compose logs if you are using Compose v1) to view the logs:

# Show all container logs and follow
docker compose logs -f

# Show only function-orchestrator logs and follow
docker compose logs function-orchestrator -f

# Alternate logs command: runs from any directory with cleaner output, but is less comprehensive.
docker logs mediawiki-function-orchestrator-1

To log exceptions from the python executor (function-evaluator/executors/python3/executor.py):

import logging
logging.exception("this is some error")

And similarly, rebuild the function-evaluator with blubber and reinitialize the MediaWiki docker compose, then view the logs for mediawiki-function-evaluator-1.
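Spelled out, the evaluator cycle mirrors the orchestrator one. The local-evaluator image tag below is a hypothetical example, not a name the docs mandate:

```shell
# Sketch of the analogous function-evaluator cycle; image tag is an assumption.
rebuild_evaluator() {
    # run from the function-evaluator root
    blubber .pipeline/blubber.yaml development | docker build -t local-evaluator -f - .
    # then, from your MediaWiki directory, restart the containers and view logs
    docker compose up -d
    docker logs mediawiki-function-evaluator-1
}
```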

Testing NodeJS Services


The easiest way I have found is to install the orchestrator locally:

# from function-orchestrator directory
npm install

# make mocha easy to find (this can go in .bashrc or OS equivalent)
alias mocha='./node_modules/mocha/bin/mocha'

# run tests (you can filter like "mocha -g 'substring of test name'")
mocha

Note: if "mocha" gives a "Cannot find module" error, run the tests using "npm run test". (This "test" script, which also runs lint, is defined in package.json. "npm run" appends a few more directories onto $PATH.)

IMPORTANT: You need to stop the Docker containers for these tests to run properly:

docker-compose down

To run only one test, you can use the flag -g TEXT, where TEXT is a string matched against the names of the describe(name, callback) and it(name, callback) calls:

# If you are running tests with npm:
npm run test:nolint -- -g 'TEXT'

# If you are running tests directly with mocha:
mocha -g 'TEXT'


It's best to test using the Docker images. I have a shell command build-run-blubber <Blubber variant> <Docker image name>:

#!/usr/bin/env bash
# place in /usr/local/bin/build-run-blubber or elsewhere on $PATH

set -e

if [ -z "$2" ]; then
    echo "Please invoke as $ build-run-blubber <Blubber variant> <image name>."
    exit 1
fi

blubber .pipeline/blubber.yaml "${1}" | \
    docker build -t "${2}" -f - . && \
    docker run "${2}"

The function-evaluator has four test variants which run in CI; they can be run as follows:

# run these commands from function-evaluator directory
# run the full test suite for the Node service, including eslint for JavaScript files
build-run-blubber test test

# run the tests for the Python executor
build-run-blubber test-python3-executor testpy

# run the tests for the JS executor
build-run-blubber test-javascript-executor testjs

# run format testing for python files
build-run-blubber format-python3 formatpy

carbon-neutral, packet-negative Green Anarchist coder life

Every time you run Blubber, you are calling an external service. This is 1) wasteful and 2) not a good thing to depend on if you intend to write code e.g. while traveling. For that reason, I recommend saving all generated Dockerfiles locally (e.g. blubber .pipeline/blubber.yaml somevariant > SOMEVARIANT.DOCKER) and using SOMEVARIANT.DOCKER in the above commands.

For example, instead of

blubber .pipeline/blubber.yaml development | docker build -t local-orchestrator -f - .

you can save a Dockerfile as above and then run

docker build -t local-orchestrator -f SOMEVARIANT.DOCKER .
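A hypothetical helper along these lines, which regenerates the cached Dockerfile only when it is missing (the function name and .DOCKER suffix are my own conventions):

```shell
# Hypothetical helper: cache the generated Dockerfile per variant and build
# from the cached copy, calling the Blubber service only when necessary.
build_cached() {
    variant="$1"
    image="$2"
    dockerfile="${variant}.DOCKER"
    if [ ! -f "$dockerfile" ]; then
        blubber .pipeline/blubber.yaml "$variant" > "$dockerfile"
    fi
    docker build -t "$image" -f "$dockerfile" .
}
```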

Maintenance scripts

Reload built-in data

function-schemata/data/definitions contains all the JSON files for the canonical built-in ZObjects that a blank Wikifunctions installation must have. When following the general installation instructions, you will be asked to run the MediaWiki maintenance/run.php update script, which loads all the built-in data into the database if it is not there yet. However, the update script will not overwrite any existing data.

If you need to restore all the built-in files to their original state (for example, after changes to the function model that update the data definitions), you can do a totally blank installation, starting with clearing the database (as explained here), or you can run the script below.

First, always make sure that your function-schemata submodule is up to date by running, from the WikiLambda directory:

$ git submodule update --init --recursive

To run the script, from your MediaWiki installation directory, do:

$ docker compose exec mediawiki php extensions/WikiLambda/maintenance/reloadBuiltinData.php 

If items fail due to label clashes, do:

$ docker compose exec mediawiki php extensions/WikiLambda/maintenance/reloadBuiltinData.php --force

This script will not clear your database: all the custom ZObjects will remain, and only the built-in ZObjects will be overwritten. To also clear the database, run:

$ docker compose exec mediawiki php extensions/WikiLambda/maintenance/reloadBuiltinData.php --force --clear

Supporting Contributors

Allow list for new contributors

To prevent users from uploading malicious code that could be executed by the CI servers, code contributors need to be added to an allow list before their patches are executed by the pipelines. You can find more information in the MediaWiki Continuous integration documentation.

To add a new user to the allow list, add the user's primary Gerrit e-mail address and push a patch like this one adding our dearest contributor Lindsay. Then you can ping the RelEng team via the #wikimedia-releng IRC channel.

Force Zuul to run tests

If you are on the list, you can force Zuul to run all tests on a patchset by adding a comment beginning with the word "recheck" in Gerrit.

Links of Interest

The Wikimedia Foundation



Abstract Wikipedia

Working tools

Mailing lists

Complete list of mailing lists:

Some interesting mailing lists you can sign up to: