- 1 Manual for PetScan
- 1.1 Introduction
- 1.2 Defining your query
- 1.3 Know-how
- 1.4 Examples
- 1.4.1 Articles in a WikiProject
- 1.4.2 Dablinks within a WikiProject
- 1.4.3 Detecting pages that have an anomalous combination of namespace and category/ies
- 1.4.4 Find uncategorized photo contributions in Commons in a given language
- 1.4.5 Items with no statements
- 1.4.6 d:Help:Import Template:Bio from itwiki
- 1.4.7 Get the sitelinks for a certain project from a SPARQL query
- 1.4.8 Add your example here...
- 2 See also
PetScan can generate lists of Wikipedia (and related projects) pages or Wikidata items that match certain criteria, such as all pages in a certain category, or all items with a certain property. PetScan can also combine several such lists (here called "sources") in various ways to create a new one. Sources include:
Pages from Wiki(m|p)edia
These are defined in the Categories, Page properties, and Templates&links tabs. You can ask for pages in category trees, with specific templates, or with links from/to specific pages; you can limit your results to certain namespaces, bot/human edits, recent edits/page creation, etc. These three tabs represent the former CatScan2 functionality. Their query result is subsequently called the "category source".
In the Other sources tab, you can add more sources, such as Wikidata SPARQL (WDQS) queries, or PagePile lists. You can also define how to combine multiple sources; by default, the subset (that is, only pages that occur in all sources) is returned as the final result. You can also specify which wiki you want your list to point to, e.g. if you combine Wikipedia and Wikidata results.
In the Wikidata tab, you can annotate or filter your results further, e.g. return only Wikidata items that have no statements. Using any of these filters will convert your list to Wikidata.
In the Output tab, you can specify options for your list, e.g. the format (web page, wiki, PagePile, etc.). You can also further filter your results, e.g. with regular expressions on page titles/item labels. You can also replace the result list with a ranked list of missing topics ("redlinks").
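The combination modes described above behave like standard set operations. A minimal sketch in Python, with made-up page names and sources (the mapping of "subset" to intersection and "difference" to "in exactly one source" follows the manual's own descriptions):

```python
# Toy illustration of how PetScan combines sources, using Python sets.
# Page names and sources are invented for the example.
source_a = {"Alice", "Bob", "Carol"}   # e.g. pages from a category source
source_b = {"Bob", "Carol", "Dave"}    # e.g. pages from a SPARQL source

subset = source_a & source_b       # pages in all sources (the default)
union = source_a | source_b        # pages in at least one source
difference = source_a ^ source_b   # pages in exactly one source

print(sorted(subset))      # ['Bob', 'Carol']
print(sorted(difference))  # ['Alice', 'Dave']
```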
Defining your query
The fields that can be set in the query form are as follows:
{| class="wikitable"
! Field !! Description !! Default !! Notes
|-
| Language || Project language code, e.g. "en" for English or "de" for German. Select "commons" for Wikimedia Commons. || "en" ||
|-
| Project || Wikimedia project to be searched (wikipedia, wiktionary, wikiversity, etc.). || "wikipedia" ||
|-
| Depth || Depth of the category trees to search. 0 means subcategories are not used. || "0" ||
|-
| Categories || List of categories, one per line, without the "Category:" prefix. || Empty || Appending <nowiki>|</nowiki> and a number sets the depth for that category tree, overriding what was chosen in the Depth field.
|-
| Negative categories || List of categories as above. Only pages that are not in these categories will be accepted. || Empty ||
|-
| Combination || How the above categories should be combined:
* Category list: lists the subcategories
* Subset: all pages that are in all category trees
* Union: all pages that are in at least one category tree
* Difference: all pages that are in only one of the category trees
* At least (N): all pages that are in at least N category trees
|| "Subset" || Options currently available are "Subset" and "Union".
|-
| Namespaces || The namespaces to use as potential pages. || Articles ||
|-
| Templates || Use only pages that:
* Box 1: contain all of the given templates
* Box 2: contain at least one of the given templates
* Box 3: contain none of the given templates
Enter one template per line, without the "Template:" prefix. Each box may be qualified by selecting "Use talk page instead".
|| Empty || This option only works with templates in the "Template:" namespace. It cannot be used with templates in the "User:" namespace, nor with the "Creator:" or "Institution:" namespaces used at Wikimedia Commons.
|-
| Last edit || Show pages whose last edit was or was not made by a bot, by an anonymous user, or is flagged. || Either / either / either ||
|-
| Last change || Date or time period of the last change to the page, in the format YYYYMMDDHHMMSS (shorter forms allowed). || Empty || "Only pages created during the above time window" lets you filter on the first change (page creation) instead.
|-
| Size || Size or size range in bytes. || Empty || Selects pages larger than one cutoff and/or smaller than another.
|-
| Links || Number or range of internal links on the page. || Empty || Selects pages with many or few links.
|-
| Top categories || Feature which is not yet available. || ||
|-
| Sort || Feature which is not yet available; it would set the sorting criteria for the output. || ||
|-
| Manual list || A list of (namespace-prefixed) page names or Wikidata items from the specified project. || Empty || The tricky part is specifying the project; the correct codes are:
|-
| Wikidata || Get the corresponding Wikidata items, if available. || ||
|-
| Format || Output format of the search results:
* HTML: web page
* CSV: comma-separated values, in quotation marks
* TSV: tab-separated values
* WIKI: wikitable
* PHP: PHP file
* XML: XML file
|| HTML ||
|-
| Do it! || Runs the query you have defined. || ||
|}
PetScan ID (PSID)
As of 2016-04-04, every query that gets run in PetScan is recorded (anonymously!) and assigned a unique, stable, numeric identifier called PSID. You can use the PSID to
- run this PetScan query as an input in tools that support PSID (such as WD-FIST)
- fill in a "short URL": https://petscan.wmflabs.org/?psid=PSID will run the query with PSID, with all its settings
- expand programmatically on a previous query, by "overwriting" parameters: https://petscan.wmflabs.org/?format=wiki&psid=PSID will run the same query as before, but the output format will be wiki (instead of default HTML, or whatever was chosen originally).
- Only the query will be stored, not its results!
- Large queries (e.g. with many manual items) will not be stored. In that case, no PSID will be shown.
- Results with an empty checkbox have possible matches within the Wikidata set.
- the interwiki link petscan: can be used to generate shortcuts for permanent queries, e.g. [[petscan:PSID]]
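The "overwriting" of stored parameters described above is plain URL query-string composition. A minimal sketch (the PSID 1234 is made up for illustration):

```python
from urllib.parse import urlencode

def petscan_url(psid, **overrides):
    """Build a PetScan short URL; keyword arguments override
    parameters stored with the PSID (e.g. format="wiki")."""
    params = {"psid": psid, **overrides}
    return "https://petscan.wmflabs.org/?" + urlencode(params)

print(petscan_url(1234))                 # run the stored query as-is
print(petscan_url(1234, format="wiki"))  # same query, wiki output
```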
Create Wikidata items for Wikipedia articles that don't have one yet (Creator functionality)
- Set up a query that returns a list of Wikipedia (or other, non-Wikidata project) pages, or paste a list into "Other sources/Manual list"
- Under the "Page properties" tab, select "Redirects=No". (This is done automatically now; you can change it back if you really want redirects in your list!)
- Under the "Wikidata" tab, select "Only pages without item" for the "Wikidata" option
- Run query
- Your results will have additional elements next to the "results" header (unless you are not logged into WiDaR, in which case you will see an appropriate link instead)
- All pages for which there is no exact match in any label or alias on Wikidata are checked by default.
- You can check/uncheck boxes manually now, if required.
- You can add default statements into the statements box; these will be added to all your new items. So, if you only create items for people, add "P31:Q5". You can add multiple statements this way (one per line). Note that P and Q must be in upper case, otherwise the statement will fail quietly.
- Click the green "Process commands" button. New items will be created (and statements added) for all checked pages.
- You can always abort the process via the red button (appears once the process has started).
- Once an item has been created, and all statements have been added, the respective page row will be removed completely from the interface.
- Use the remaining entries to manually search and match the Wikipedia pages to existing Wikidata items, where possible.
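Since lower-case statements fail quietly, it can help to sanity-check the statements box before processing. A small sketch, assuming the common "property:item" form shown above (e.g. "P31:Q5"); PetScan may accept other value types, so this pattern is an assumption, not the tool's full grammar:

```python
import re

# Matches only the "Pnnn:Qnnn" form, e.g. "P31:Q5"; upper case required,
# mirroring the manual's warning that lower case fails quietly.
STATEMENT_RE = re.compile(r"^P\d+:Q\d+$")

def valid_statement(line: str) -> bool:
    return bool(STATEMENT_RE.match(line.strip()))

print(valid_statement("P31:Q5"))   # True
print(valid_statement("p31:q5"))   # False
```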
Example: Biologists by field of research on English Wikipedia (query will take ~30 seconds; might not show any results, depending on how recently someone used this example link to create items)
Add/remove statements for Wikidata items
It is possible to add or remove statements on Wikidata items with PetScan. For this, it is crucial to choose "Wikidata" under "Other sources -> Use wiki". The command box will then appear next to the result count, and you can continue as described in the previous section.
Articles in a WikiProject
A request on the Talk page of this Manual: Find all mainspace articles within "WikiProject UK geography". Starting with a default PetScan submission form, just add "WikiProject UK geography" to the first box of the Categories row, and, just below, select "Use talk pages instead". Here is the query filled out. Hit "Do it!" at bottom. When run on 16 August 2015, the query required 1.5 seconds to run, and yielded a list of 21,408 articles. The list appears BELOW the submission form (which remains on your screen), so you have to scroll down to see the results.
Dablinks within a WikiProject
Editors working on disambiguation seek to enlist members of a content-area WikiProject, specifically WikiProject Canada, to help. A PetScan report is designed to find all articles having ambiguous links that are within the given WikiProject. Criteria applied:
- Articles having ambiguous links are within "Category:All articles with links needing disambiguation", so paste "All articles with links needing disambiguation" into the PetScan Categories field.
- Depth is set arbitrarily to 9, meaning that articles as far as 9 subcategories down from the "needing disambiguation" parent category will be found. (Searching to that depth is not necessary in this case but doesn't hurt.)
- Articles within WikiProject Canada have "Template:WikiProject Canada" on their talk pages, so paste "WikiProject Canada" into PetScan's "Has any of these templates" field, and just below select "Use talk pages instead" as a qualifier.
- Only regular articles, not disambiguation pages, are wanted, and disambiguation pages are distinguished by having template:disambiguation, so paste "Disambiguation" into PetScan's "Has none of these templates" field, and make sure "Use talk pages instead" is not selected.
- These criteria are implemented by this PetScan submission form, filled out. To submit the query, select "Do it!" at the bottom.
- When submitted on 16 August 2015, the query took 31 seconds to run, and results were a list of 255 articles. The results show BELOW the PetScan submission form, which remains in place, so you may see no change on your screen. You have to know to scroll down to find the results! That request was run with default Output format "HTML".
- To obtain the results in a Wikitable, in order to share them at a subpage of the WikiProject, the request could be revised to select Format "WIKI". This time the results, in wikitable markup, replace the PetScan submission form on your screen.
- To make a more useful list for disambiguators, set up so that DabSolver opens on any item clicked, a multi-step process can be followed. Here, the results were saved in tab-separated format instead, then brought into Excel, where a column was composed that concatenated simple text strings with the results; that column was then copy-pasted. The results were pasted to the English-language Wikipedia page w:Wikipedia:Canadian Wikipedians' notice board/ArticlesNeedingDisambiguation2015-08-17 and were also posted within a scrolling window in discussion at the WikiProject Canada talk page. --Doncram (talk) 19:50, 24 August 2015 (UTC) link adjusted. DexDor (talk) 06:58, 29 March 2016 (UTC)
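The Excel concatenation step above can also be scripted. A hedged sketch: the base URL below is a placeholder (substitute DabSolver's real address), and the "title" column name is an assumption about PetScan's TSV header, so adjust both to match your actual output:

```python
import csv
import io

# Placeholder URL: replace with DabSolver's real address.
BASE = "https://example.org/dab_solver?page="

def dab_links(tsv_text, column="title"):
    """Turn PetScan TSV output into a list of wiki-markup links,
    one bullet per page, pointing at the (placeholder) tool URL."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    return ["* [%s%s %s]" % (BASE, row[column], row[column])
            for row in reader]

sample = "title\tnamespace\nCanada\t0\nToronto\t0\n"
for line in dab_links(sample):
    print(line)
```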
Detecting pages that have an anomalous combination of namespace and category/ies
PetScan can be used to find pages that are in a category (or combination of categories) that is not appropriate for pages in a particular namespace - e.g. Wikipedia administration pages that are in a category that should only contain encyclopedic articles. This can then be fixed (e.g. by moving an article to the correct namespace or by editing a discussion to insert a missing ":" where a category is being referred to). The first step in this process is to identify (using PetScan) categories that cause incorrect categorization (e.g. Wikipedia administration categories that are in article categories).
DexDor (English Wikipedia)
Find uncategorized photo contributions in Commons in a given language
(Based on Grants:Learning patterns/Treasures or landmines: detecting uncategorized, language-specific uploads in Commons. See the motivation and full explanation there! Thank you to User:Spiritia and the other contributors/commenters there for this example!)
Run a query using PetScan with the following settings:
- Language = commons
- Project = wikimedia
- Depth = 1
- Categories = Media needing categories
- Combination = ☑ Subset
- Namespaces = ☑ File
- Templates: Has all of these templates = <your language code>
- Format: ☑ Extended data for files, ☑ File usage data
The English language code is "en"; the Romanian language code is "ro". To find uncategorized photos uploaded by users using the Romanian language, a version of the query (with HTML output, and without autorun) is:
As of 15 March 2016, after hitting "run" the query requires about 105 seconds to finish, and yields 1748 uncategorized photos.
- The "Language =" field is not used to select the desired language; the desired language code is set in the "Template" field instead.
- The language code is case-sensitive in the query! So for example use "ro" not "RO".
- To generate the results there, Format: ☑ Wiki was chosen instead of the default HTML output.
Enjoy! Thanks again to User:Spiritia especially!
Items with no statements
The option "Has no statements" can be used to find:
- items without statements for a category at Wikipedia (sample: en:Category:United States geography stubs)
- items without statements for an entire Wikipedia language version (sample: "sowiki")
d:Help:Import Template:Bio from itwiki
Steps to import the template, some using PetScan.
Get the sitelinks for a certain project from a SPARQL query
- Indicate the project on the 'Categories' tab, e.g. "de" for Language and "wikipedia" for Project to use the German-language edition of Wikipedia.
- In "Other sources", enter your SPARQL query.
- Make sure to select "From categories" from the "Use wiki" options.
- Press "Do it!"
This could be useful to get the pageviews of a certain set of pages based on a SPARQL query. You can save the result to a PagePile (check the Output tab), then enter that PagePile ID in Massviews Analysis (select 'Page Pile' from the Source dropdown).
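The PagePile-to-Massviews hand-off can be scripted as a URL. A sketch only: the Massviews host and the parameter names ("source", "target") are assumptions based on the tool's interface, so verify them against Massviews before relying on this; the PagePile ID 4321 is made up:

```python
from urllib.parse import urlencode

def massviews_url(pagepile_id):
    """Build a Massviews URL for a PagePile ID. Host and parameter
    names are assumptions; check them against the live tool."""
    params = {"source": "pagepile", "target": pagepile_id}
    return "https://pageviews.wmcloud.org/massviews/?" + urlencode(params)

print(massviews_url(4321))
```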