Note: this text is essentially an automatic translation of the French version, which is the reference.
Did you meet your goals? Are you happy with how the project went?
1) Migrate wikiscan.org to new servers
|Receiving and checking new servers||OK||There was a strange problem with CPU frequency throttling on one server that took a long time to diagnose, but it was eventually fixed.|
|HTTPS support for all subdomains (wildcard)||OK|
|Setting up and configuring servers||OK||For the configuration of the new servers, I saved time by adapting scripts that I use for other servers. I took the opportunity to update the operating system on the old server.|
|Ensure compatibility with PHP 7||OK||Notably the replacement of list() = each() constructs (each() is deprecated since PHP 7.2) and of calls such as count(null), which now trigger a warning.|
|Adapt the Wikiscan code to distribute wikis to two external databases||OK|
|Move databases to new servers||OK|
2) Include all Wikimedia wikis and make the necessary optimizations
|Optimize statistical queries of small wikis to minimize the number of queries||OK||Queries on the Tools (Wikimedia Cloud) database replicas are quite slow; instead of reading the data day by day, small wikis are now read month by month.|
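The switch from daily to monthly reads can be sketched as follows (a minimal Python illustration of the batching idea only; the function name and date handling are mine, not Wikiscan's actual code):

```python
from datetime import date, timedelta

def month_ranges(start, end):
    """Yield (range_start, range_end) date pairs covering [start, end]
    month by month. Reading a whole month per query replaces up to 31
    daily queries with one, which matters when each round trip to a
    slow replica dominates the cost."""
    current = start
    while current <= end:
        # First day of the following month.
        if current.month == 12:
            next_month = date(current.year + 1, 1, 1)
        else:
            next_month = date(current.year, current.month + 1, 1)
        yield current, min(next_month - timedelta(days=1), end)
        current = next_month

# Three queries instead of fifty-five daily ones:
ranges = list(month_ranges(date(2019, 1, 15), date(2019, 3, 10)))
# → [(2019-01-15, 2019-01-31), (2019-02-01, 2019-02-28), (2019-03-01, 2019-03-10)]
```

Each yielded pair then becomes the bounds of a single statistics query against the replica.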
|Optimize the servers that will have to manage more than 800 sites in parallel||OK||I prioritized speed over data safety because the data are not critical and are regularly recalculated, for example by using innodb_flush_log_at_trx_commit = 2 and transaction_isolation = READ-COMMITTED for MySQL.|
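For reference, the two MySQL settings mentioned above belong in the server configuration; a minimal excerpt (file path and surrounding options will differ per installation):

```ini
# /etc/mysql/my.cnf (excerpt) — trades durability for speed, acceptable
# here because the statistics are non-critical and regularly recalculated.
[mysqld]
# Flush the InnoDB log to disk about once per second instead of at every
# commit; up to ~1 s of committed transactions can be lost on a crash.
innodb_flush_log_at_trx_commit = 2
# READ-COMMITTED holds fewer locks than the default REPEATABLE-READ.
transaction_isolation = READ-COMMITTED
```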
|Add all Wikimedia wikis not currently included||OK||The total number of wikis has increased from 381 to 896.|
3) Adapt and optimize the code to the recent restructuring of the MediaWiki tables
|Support for the new comment and actor tables||OK|
|Optimization of queries, especially for large wikis||Partial||This problem is more complex than expected: some updates fail when the Tools replication lags. This mainly concerns very large wikis, but also smaller wikis during periods of high bot activity. Several restructurings would be needed to automatically adjust the size of queries according to the number of rows to read.
For now I have disabled the comment table joins for wikis that were often stuck, and some updates are run less frequently.|
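Automatically adjusting query size to the number of rows to read could look roughly like this (a Python sketch under my own assumptions: the `fetch` callable, the integer day numbering, and the timing thresholds are all hypothetical, not Wikiscan's code):

```python
import time

def read_in_adaptive_chunks(fetch, first_day, last_day,
                            target_seconds=5.0, initial_days=31):
    """Read a range of days with fetch(start_day, end_day), shrinking the
    window when queries run slow and widening it when they run fast.

    Days are plain integers (e.g. ordinal dates); `fetch` runs one replica
    query for the inclusive range and returns its rows."""
    days = initial_days
    day = first_day
    results = []
    while day <= last_day:
        end = min(day + days - 1, last_day)
        t0 = time.monotonic()
        results.extend(fetch(day, end))
        elapsed = time.monotonic() - t0
        if elapsed > target_seconds and days > 1:
            days = max(1, days // 2)       # slow query: halve the window
        elif elapsed < target_seconds / 4:
            days = min(days * 2, 366)      # fast query: widen the window
        day = end + 1
    return results
```

The point is that the feedback signal is the observed query time rather than the wiki's overall size, which the report notes is not a sufficient indicator on its own.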
Please report on your original project targets.
|Target outcome||Achieved outcome||Explanation|
|Wikis supported from 381 to 800+||896 supported wikis||All public wikis have been added; the list is visible from this page. (All wikis with fewer than 100,000 edits were not included before; this corresponds to size 1 at the end of the table.)|
Projects do not always go according to plan. Sharing what you learned can help you and others plan similar projects in the future. Help the movement learn from your experience by answering the following questions:
- What worked well?
- The reuse and adaptation of personal scripts to configure servers was very useful. It allowed me to configure the two new servers identically, and I was able to reuse the scripts on the old server once its system update was done. The scripts use simple SSH and rsync commands; I no longer use Puppet.
- What did not work so well?
- The change of the calculation mode from day to month for small wikis required other adaptations; there is an unresolved bug in the calendar menu with the monthly users graph.
- The slowness of queries on the Wikimedia replicas will require more development to build a system intelligent enough to adapt to the situation. The overall size of a wiki is not a sufficient indicator: small wikis sometimes receive a very large influx of automatic contributions, and replication seems even slower in that case.
- The project/grant management took me longer than expected, but I saved a little time on the installation of the servers.
- What would you do differently next time?
- Allow more time for project management.
Grant funds spent
Please describe how much grant money you spent for approved expenses, and tell us what you spent it on.
Total spent: 1780 €
|#||Description||Planned hours||Hours used|
|1||Migrating wikiscan.org to new servers||21 h||18 h|
|2||Include all Wikimedia wikis||10 h||9.5 h|
|3||Adapt and optimize the code to MediaWiki restructuring||10 h||11 h|
|4||Project management||7 h||15.5 h|
|Total||48 h||54 h|
Do you have any remaining grant funds?