Abstract Wikipedia/Abstract developer cheatsheet

From Meta, a Wikimedia project coordination wiki

This is a Wikifunctions development cheatsheet and collection of helpful links.

Gerrit Cheatsheet

These are gerrit- and git-review-related summaries that I created for my own use and reference; you can find all of this information in more depth in the official Gerrit and git-review documentation.

Flow for creating and pushing a new patch:

$ git checkout -b mynewfeature
# do stuff and stage it
$ git commit
# Write message and description with Bug: <Phabricator task> at the bottom
$ git review
# Maybe do some more modifications on your patch
$ git commit --amend
$ git review

Git retains the commit history of the current branch when you create a new branch. If you want a "clean" branch, create it from origin/master:

$ git fetch  # to make sure you have the most up-to-date version of master
$ git checkout origin/master
$ git checkout -b my-new-branch

Flow for reviewing or amending a patch:

$ git review -d <gerrit patch ID> # the change number shown in the Gerrit URL of the patch
# do stuff and stage it
$ git commit --amend
# modify the commit message or leave it the same
$ git review

Rebase not working through git review? Use the same flow: do a normal rebase, amend the commit, and send it for review:

$ git checkout master
$ git pull origin master
$ git checkout <your branch>
$ git rebase master
$ git commit --amend
$ git review

Other git review stuff to remember:

$ git review -s --verbose  # Setup git review
$ git review -d 643568     # -d change (--download=change) downloads a change ID into a local branch.
$ git review
$ git review -R            # Git review but without performing rebase (--no-rebase)
$ git review -f            # Submit a change for review and close local branch (--finish)

Updating MediaWiki installation

Follow environment installation instructions from https://gerrit.wikimedia.org/g/mediawiki/core/+/HEAD/DEVELOPERS.md

  • Go to mediawiki core and do git pull
  • This does not update submodules, and you shouldn't update them directly if you don't want to lose your extension changes. You might need to update your skin version once in a while; for that, go to skins/Vector and do git pull
  • Remove the directory cache/sqlite
  • Rename LocalSettings.php (for example to LocalSettings.php.bak) so that the installer can generate a fresh one
  • Run docker compose up -d
  • Run docker compose exec mediawiki composer update
  • Run the installation script docker compose exec mediawiki /bin/bash /docker/install.sh
  • Copy your personal changes to the newly generated LocalSettings.php
  • Run docker compose exec mediawiki php maintenance/update.php
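The steps above can be sketched as a single script. This is an illustrative sketch, not official tooling: it assumes a docker compose v2 setup in mediawiki core, and the DRY_RUN switch is my addition so the script prints the commands by default instead of running them.

```shell
#!/bin/sh
# Sketch of the MediaWiki update steps above; run from your mediawiki core checkout.
# DRY_RUN defaults to 1 so the commands are only printed; set DRY_RUN= to execute them.
set -e
DRY_RUN=${DRY_RUN-1}
run() { if [ -n "$DRY_RUN" ]; then echo "+ $*"; else "$@"; fi; }

run git pull                                  # update mediawiki core
run git -C skins/Vector pull                  # update the Vector skin only
run rm -rf cache/sqlite                       # clear the sqlite cache
run mv LocalSettings.php LocalSettings.php.bak
run docker compose up -d
run docker compose exec mediawiki composer update
run docker compose exec mediawiki /bin/bash /docker/install.sh
# now copy your personal changes from LocalSettings.php.bak into the new LocalSettings.php
run docker compose exec mediawiki php maintenance/update.php
```

Set DRY_RUN= (empty) once you have checked the printed commands and want to actually run them.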

Wikifunctions development workflow tips

WikiLambda starting guide


Git Submodules

To initialize and fetch all submodules (such as function-schemata) after cloning a repository:

git submodule update --init --recursive

Updating Submodules

Once there are new changes in function-schemata, it is convenient to keep the other projects updated to schemata's most recent version. To do so, there is a convenience script in all three projects (WikiLambda, function-orchestrator and function-evaluator). To do a function-schemata pull-through, run the following commands from the updated master branch of each of these repositories:

$ ./bin/updateSubmodule.sh
$ git review

This will generate a patch for each repo with a schemata pull-through to the latest version in master and a commit summary containing details of every new update that the new schemata version includes.

WikiLambda PHPUnit tests

About PHP unit testing: https://www.mediawiki.org/wiki/Manual:PHP_unit_testing/Running_the_tests

# Run the whole test suite using local settings:
$ docker-compose exec mediawiki php tests/phpunit/phpunit.php

# Run WikiLambda tests:
$ docker-compose exec mediawiki php tests/phpunit/phpunit.php extensions/WikiLambda/tests/phpunit/

# Run tests and generate coverage in HTML
$ docker-compose exec mediawiki php tests/phpunit/phpunit.php --coverage-html="./docs/coverage" extensions/WikiLambda/tests/phpunit/ --whitelist="./extensions/WikiLambda"

# Run tests and generate coverage in txt (and print it)
$ docker-compose exec mediawiki php tests/phpunit/phpunit.php --coverage-text="./docs/coverage.txt" extensions/WikiLambda/tests/phpunit/ --whitelist="./extensions/WikiLambda" ; cat ./docs/coverage.txt

PHP linting

# Apply automatic fixes
docker-compose exec mediawiki composer fix extensions/WikiLambda/

# Check other PHP CodeSniffer errors (fix manually)
docker-compose exec mediawiki composer phpcs extensions/WikiLambda/

WikiLambda front-end tests

# Install npm dependencies
npm ci 
# Run tests
npm test

Function-schemata tests

# Install npm dependencies
npm ci

# Run tests
npm test

To run the tests without the linting step:

npm run test:nolint

If you want to run a specific test, you can do:

npm run test:nolint -- -f <substring matching test name>

# For example:
npm run test:nolint -- -f "canonical lists"

Other entrypoints and composer commands

Collected by James: https://phabricator.wikimedia.org/T285275

Debugging and logs

Add to your LocalSettings.php:

# Load Development Settings
# (Alternatively can run phpunit or any maintenance script with the -mwdebug option)
require "$IP/includes/DevelopmentSettings.php";

With DevelopmentSettings, logs are written to the default directory ./cache (mw-error.log, mw-dberror.log, etc.). You can print to these logs using the default groups (error, exception…) with wfDebugLog:

wfDebugLog( 'error', "Here's some error information about the variable $var" );

You can also create your custom log groups following https://www.mediawiki.org/wiki/Manual:How_to_debug#Creating_custom_log_groups
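As a sketch, a custom log group is declared in LocalSettings.php via the standard $wgDebugLogGroups setting; the group name and file path below are illustrative, not part of the default setup:

```php
// LocalSettings.php: route a custom log group to its own file
// ('WikiLambda' and the path are example values, not defaults)
$wgDebugLogGroups['WikiLambda'] = "$IP/cache/mw-wikilambda.log";
```

Messages logged with wfDebugLog( 'WikiLambda', '...' ) will then go to that file.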

Debugging and logs in NodeJS services

Logging messages in function-orchestrator, function-evaluator and function-schemata can be done simply using console.log or console.error. To see the messages, you must rebuild the project with Blubber and re-initialize the Docker containers.

For example, after adding console.log statements in function-orchestrator or in its submodule function-schemata, run in the function-orchestrator root directory:

blubber .pipeline/blubber.yaml development | docker build -t local-orchestrator -f - .

Note (CM): I have the above command saved as a shell script since I use it dozens of times a day :).

Once built, restart your mediawiki docker containers:

docker compose up -d

Use docker compose logs (or docker-compose logs if you are on Compose v1) to view the logs:

# Show all container logs and follow
docker compose logs -f

# Show only function-orchestrator logs and follow
docker compose logs function-orchestrator -f

# Alternate logs command: runs from any directory with cleaner output, but is less comprehensive.
docker logs mediawiki-function-orchestrator-1

To log exceptions from the python executor (function-evaluator/executors/python3/executor.py):

import logging
logging.exception("this is some error")

Similarly, rebuild the function-evaluator with Blubber and re-initialize the MediaWiki docker compose setup, then view the logs for mediawiki-function-evaluator-1.

Testing NodeJS Services


The easiest way I have found is to install the orchestrator locally:

# from function-orchestrator directory
npm install

# make mocha easy to find (this can go in .bashrc or OS equivalent)
alias mocha='./node_modules/mocha/bin/mocha'

# run tests (you can filter like "mocha -g 'substring of test name'")
mocha

Note: if "mocha" gives a "Cannot find module" error, run the tests using "npm run test". (This "test" script, which also runs lint, is defined in package.json. "npm run" prepends a few more directories onto $PATH.)

IMPORTANT: You need to stop the Docker containers for these tests to run properly:

docker-compose down

To run only one test, you can use the flag -g TEXT, where TEXT is a string matched against the names in the describe(name, callback) and it(name, callback) calls:

# If you are running tests with npm:
npm run test:nolint -- -g 'TEXT'

# If you are running tests directly with mocha:
mocha -g 'TEXT'


It's best to test the function-evaluator using the Docker images. First, I have a shell command build-run-blubber <Blubber variant> <Docker image name>:

#!/bin/sh
# place in /usr/local/bin/build-run-blubber or elsewhere on $PATH

set -e

if [ -z "$2" ]; then
    echo "Please invoke as $ build-run-blubber <Blubber variant> <image name>."
    exit 1
fi

blubber .pipeline/blubber.yaml "${1}" | \
    docker build -t "${2}" -f - . && \
    docker run "${2}"
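A note on the argument check in the script: the parameter should be quoted ([ -z "$2" ]), since an unquoted empty parameter relies on accidental one-argument test behavior and breaks for values containing spaces. A quick standalone demonstration:

```shell
# Simulate invoking the script with only one argument: $1 is set, $2 is unset,
# so the quoted emptiness test fires and the usage message is printed.
set -- test-variant
if [ -z "$2" ]; then
    echo "missing image name"
fi
```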

The function-evaluator has four test variants which run in CI; they can be run as follows:

# run these commands from function-evaluator directory
# run the full test suite for the Node service, including eslint for JavaScript files
build-run-blubber test test

# run the tests for the Python executor
build-run-blubber test-python3-executor testpy

# run the tests for the JS executor
build-run-blubber test-javascript-executor testjs

# run format testing for python files
build-run-blubber format-python3 formatpy

carbon-neutral, packet-negative Green Anarchist coder life

Every time you run Blubber, you are calling an external service. This is 1) wasteful and 2) not a good thing to depend on if you intend to write code while traveling, for example. For that reason, I recommend saving all generated Dockerfiles locally (e.g. blubber .pipeline/blubber.yaml somevariant > SOMEVARIANT.DOCKER) and using SOMEVARIANT.DOCKER in the commands above.

For example, instead of

blubber .pipeline/blubber.yaml development | docker build -t local-orchestrator -f - .

you can save a Dockerfile as above and then run

docker build -t local-orchestrator -f SOMEVARIANT.DOCKER .

Maintenance scripts

Reload built-in data

function-schemata/data/definitions contains the JSON files for the canonical built-in ZObjects that a blank Wikifunctions installation must have. When following the general installation instructions, you will be asked to run the MediaWiki maintenance/update.php script, which loads all the built-in data into the database if it is not there yet. However, the update script will not overwrite any existing data.

If you need to restore all the built-in files to their original state (for example, because changes in the function model updated all the data definitions), you can do a totally blank installation starting by clearing the database, or you can run the following script:

# From your MediaWiki installation directory, do
$ docker compose exec mediawiki php extensions/WikiLambda/maintenance/reloadBuiltinData.php

This script will not clear your database: all the custom ZObjects will remain, and only the built-in ZObjects will be overwritten.

Generate the data dependencies file

The function-schemata/data/definitions/dependencies.json file contains a JSON object that maps every ZID to the ZIDs that must be inserted before it. After some changes, the dependencies might change, and the file will need to be regenerated. To do so:

# From your WikiLambda directory, make sure that function-schemata is up to date with master and create a new branch
$ cd function-schemata
$ git checkout master
$ git pull origin master
$ git checkout -b <branch_name>
# From your mediawiki core directory, run the generateDependenciesFile maintenance script
$ docker compose exec mediawiki php extensions/WikiLambda/maintenance/generateDependenciesFile.php

You will see changes generated in the dependencies.json file. Commit and push the patch, and then update the WikiLambda function-schemata submodule to HEAD.

Links of Interest

  • The Wikimedia Foundation
  • Abstract Wikipedia
  • Working tools
  • Mailing lists