Please add to this list.
- Mass-upload tool
- Improve and complete a mass uploader that is easy to use and explain.
- Translation request tool
- A way to coordinate translation needs across projects. (Danny B.)
- Magnus' suggestions
"Page view stats per wiki, page, and month...
- Currently the files on the toolserver are per hour, and it's a pain to summarize them all...
- A database I can query would be even better."
- UPDATE: WMF Engineering is working on this at high priority, as part of the ReportCard project.
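Until the ReportCard work lands, the summarizing step Magnus describes can be scripted against the hourly files. A minimal sketch, assuming the files follow the public pagecounts dump format (one whitespace-separated line per page: project, page title, hourly views, bytes transferred); feed it every hourly file for a month to get monthly totals:

```python
from collections import defaultdict

def aggregate_pagecounts(lines):
    """Sum hourly pagecount lines into per-(project, page) view totals.

    Assumes the pagecounts-raw line format:
    "project page_title hourly_views bytes_transferred".
    """
    totals = defaultdict(int)
    for line in lines:
        parts = line.split()
        if len(parts) != 4:
            continue  # skip malformed or truncated lines
        project, title, views, _bytes = parts
        totals[(project, title)] += int(views)
    return dict(totals)

# Two "hourly" samples for the same page add up to one monthly figure.
hour1 = ["en Main_Page 1200 48000", "de Hauptseite 300 9000"]
hour2 = ["en Main_Page 800 32000"]
monthly = aggregate_pagecounts(hour1 + hour2)
```

In practice you would loop over all the gzipped hourly files for the month and stream their lines through the same function.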
- Library/archives-specific tools
- Discuss varying needs and opportunities for library tools: WikiSource, transcription, citations.
- In particular, to offer to OPAC providers.
- Live example: National Library of Australia (click the Cite button, then pick Wikipedia)
- a Wikipedia Citation Extension for OPAC item pages
- Given a page on a Web OPAC, scrape data (from the MARC?) and generate a Wikipedia-style citation
- Support Firefox and Chrome
- Major moving parts:
- Logic: Given N fields, make a Wikipedia Citation
- Platform support: for major OPAC vendors O1..On, identify methods to scrape fields F1..Fn
- This is what Grant Dickie, Asaf Bartov, Djembayz and Danny B. ended up working on. Code on GitHub, documentation on the tech Etherpad.
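The "logic" part above (given N fields, make a Wikipedia citation) could be sketched like this. The parameter names follow Wikipedia's {{cite book}} template, but the record keys and the field ordering are illustrative assumptions, not the actual extension code:

```python
def make_cite_book(fields):
    """Render scraped OPAC fields as {{cite book}} wikitext.

    `fields` maps cite-book parameter names (title, last, first,
    year, publisher, isbn, ...) to values; empty values are dropped,
    so partial scrapes still yield a valid template.
    """
    order = ["last", "first", "title", "publisher", "year", "isbn", "oclc"]
    parts = ["{{cite book"]
    for key in order:
        value = fields.get(key, "").strip()
        if value:
            parts.append("| %s = %s" % (key, value))
    parts.append("}}")
    return " ".join(parts)

# Hypothetical record scraped from an OPAC item page.
record = {"title": "Seven Little Australians", "last": "Turner",
          "first": "Ethel", "year": "1894", "publisher": "Ward, Lock & Co."}
citation = make_cite_book(record)
```

The per-vendor scrapers would each produce a `fields` dict like this, so the citation logic stays in one place.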
GLAM-Wiki out in the wild... and other inspiration for potential future tool collaborations.
- NARA's Wikisource transcription page
- Open Access Media Importer (Proof of concept)
- Powerhouse Museum easy Wikipedia citation link in their collections database
- Balboa Park's Sammu (Synchronized Automatic Media and Metadata Uploader; uses Flickr, not Commons)
- Children's Museum use of Wikipedia Widget
Status of Bulk Upload tool
- Wikimedia Bulk Uploader (developed at GLAMcamp NYC)
"There are 3 problems with the mass uploader from GLAMcamp NYC:
- It runs on the toolserver, which hasn't been incredibly reliable lately. I really need to find a new home for it. If anyone knows of a reliable Linux server I could run it on, let me know.
- It's only an uploader and doesn't have a meta-data ingester. You still have to write a custom script for parsing whatever meta-data you have into its database.
- It only works in English and doesn't have any localizability currently.
Plus, it was written in a few days so it's still rough around the edges and likely has some bugs. There's also the mass uploader that was developed for Wiki Loves Monuments, which is based on the UploadWizard. I might want to try playing around with this one when I come to GLAMcampDC, as it could be a better starting point for similar tools."
- Update from Kaldari, pulled from on-list discussion.
Data ingestion tool
At the moment we have tools to do specific batch uploads. These tools don't share code, so there is a lot of duplication. It would be nice to have the useful functions abstracted, cleaned up, and put into a library as part of Pywikipedia, so it's much easier to create a new batch-uploading bot.
- Multichill's sources
- NARA uploader link
- Commons:Category:Data ingestion layout templates Multichill 22:04, 7 February 2012 (UTC)
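One step every batch-upload bot repeats is turning an institution's metadata record into a Commons file description page, so that is a natural candidate for the shared library. A minimal sketch, assuming a plain dict per record; the {{Information}} and {{int:...}} wikitext follows Commons conventions, but the record keys and sample values are illustrative:

```python
def make_information_page(record, license_template):
    """Build Commons file description wikitext from one metadata record.

    Maps a record dict onto {{Information}}, a license tag, and
    categories, ready to hand to whatever uploader is in use.
    """
    info = "\n".join([
        "=={{int:filedesc}}==",
        "{{Information",
        "|description = %s" % record.get("description", ""),
        "|date = %s" % record.get("date", ""),
        "|source = %s" % record.get("source", ""),
        "|author = %s" % record.get("author", ""),
        "}}",
        "",
        "=={{int:license-header}}==",
        license_template,
    ])
    cats = "\n".join("[[Category:%s]]" % c
                     for c in record.get("categories", []))
    return info + ("\n\n" + cats if cats else "")

# Hypothetical record from an archive's metadata export.
page = make_information_page(
    {"description": "Steam locomotive, 1905", "date": "1905",
     "source": "Example Archive", "author": "Unknown",
     "categories": ["Steam locomotives"]},
    "{{PD-old}}")
```

With this in a shared library, a new ingestion bot only needs a parser from the institution's format (CSV, XML, MARC, ...) to these record dicts.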
Upload Wizard Campaigns
This is the uploading system that was used for last year's Wiki Loves Monuments. It allows you to customize the Upload Wizard interface for a particular event. You can set your campaign to use custom templates, fields, and categories. You can also customize which licenses are available.
- Existing campaigns: http://commons.wikimedia.org/wiki/Special:UploadCampaigns
- Example: French WLM campaign: http://commons.wikimedia.org/w/index.php?title=Special:UploadWizard&campaign=wlm-fr
- Documentation: http://www.mediawiki.org/wiki/Extension:UploadWizard/Campaigns