Grants:IEG/Improve 'Upload to Commons' Android App/Renewal/Final
Welcome to this project's final report! This report shares the outcomes, impact and learnings from the Individual Engagement Grantee's 6-month project.
Part 1: The Project
In a few short sentences, give the main highlights of what happened with your project. Please include a few key outcomes or learnings from your project in bullet points, for readers who may not make it all the way through your report.
- Major improvements to the Nearby feature, with a new UI, direct uploads, real-time location tracking, and Wikidata integration. New "Explore" feature, Notifications system, and Achievements page.
- 57,457 images were uploaded to Commons through the app during the grant period; a total of 20,098 distinct images uploaded through the app have been used in Wikimedia articles; 862 places that need photos had photos added to them via our new Nearby direct uploads.
Methods and activities
What did you do in your project?
Please list and describe the activities you've undertaken during this grant. Since you already told us about the setup and first 3 months of activities in your midpoint report, feel free to link back to those sections to give your readers the background, rather than repeating yourself here, and mostly focus on what's happened since your midpoint report in this section.
The midpoint report covers our methods/workflow and all the new features and improvements that were implemented up to v2.6.7. In this report, we will cover the new features and improvements in v2.7 and beyond that are relevant to this grant.
More details and screenshots can be found in the associated blog post for the v2.7 release.
New "Nearby Places that Need Pictures" UI with direct uploads (and associated category suggestions)
Users can upload a photo directly from the map or list of nearby places that need pictures. Below is an example workflow:
- Go to the map of Nearby Places and select the corresponding pin from there, or the corresponding item from the list
- Tap the camera or gallery button, and select or take an image as usual
- The title and description of the image are automatically pre-filled in the next screen, but you can edit them if you wish
- If that item has a Commons category associated with it, that category will be at the top of the list of suggested categories
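As a rough illustration of the last step, the suggestion ordering can be sketched in Python. The function and variable names here are hypothetical, not the app's actual code (the app itself is written in Java); this is just a sketch of the "Wikidata-linked category first" behavior:

```python
def rank_category_suggestions(suggested, wikidata_category=None):
    """Order category suggestions, putting the place's own Commons
    category (taken from its Wikidata item, if any) first."""
    ranked = list(suggested)
    if wikidata_category:
        # Avoid a duplicate if the category was already suggested
        if wikidata_category in ranked:
            ranked.remove(wikidata_category)
        ranked.insert(0, wikidata_category)
    return ranked

# Example: uploading to the "Eiffel Tower" pin
suggestions = ["Towers in Paris", "Eiffel Tower", "Landmarks"]
print(rank_category_suggestions(suggestions, "Eiffel Tower"))
# -> ['Eiffel Tower', 'Towers in Paris', 'Landmarks']
```

If the place has no associated Commons category, the suggestions are left in their original order.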
Added Notifications activity to display Commons user talk messages
Any user talk messages that a user receives can now be viewed via the “Notifications” screen (accessible via the navigation drawer).
Added real-time location tracking in Nearby
The map of Nearby places now tracks a user's real-time position, moving the marker on the map as they move.
More details and screenshots can be found in the associated blog post for the v2.8 release.
Wikidata P18 edits via Nearby uploads
When users upload an image to a pin on the Nearby map or list, the image will be automatically added to the P18 property of the associated Wikidata item.
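Under the hood, this is a standard Wikibase API edit. A minimal sketch of the request parameters for a `wbcreateclaim` call is shown below; the helper function and example values are hypothetical, and the app's real implementation differs:

```python
import json

def build_p18_claim_params(item_id, filename, csrf_token):
    """Build the parameter set for a Wikibase `wbcreateclaim` API call
    that attaches an image (property P18) to a Wikidata item.
    Illustrative sketch only, not the app's actual code."""
    return {
        "action": "wbcreateclaim",
        "format": "json",
        "entity": item_id,              # e.g. "Q243" (a Wikidata item ID)
        "property": "P18",              # the "image" property
        "snaktype": "value",
        "value": json.dumps(filename),  # P18 values are file names, JSON-encoded
        "token": csrf_token,            # CSRF token from the API
    }

params = build_p18_claim_params("Q243", "Tour Eiffel.jpg", "+\\")
```

These parameters would be POSTed to the Wikidata API endpoint by an authenticated client.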
"Explore" feature to browse other images (including featured images) on Commons
This feature melds our implementation of a grant task (displaying featured images) with a GSoC project by Ujjwal Agrawal that was mentored by Neslihan Turan and Nicolas Raoul. Featured images are displayed by default in the "Explore" feature (grant task), but all images in Commons can be viewed by searching for a specific title or category (GSoC project). More details about an image can be viewed by tapping on it.
View user achievements and upload statistics
The new “Achievements” feature is part of a GSoC project by Tanvi Dadu, mentored by Vivek Maskara and me. Accessible by tapping the “trophy” icon next to your username, it shows the number of images you have uploaded, the percentage of your images that were not deleted, and the number of your images used in articles, all of which are used to calculate your level.
More details and screenshots can be found in the associated blog post for the v2.9 release.
New main screen UI
Our new main screen UI features a more prominent Nearby tab and a floating action button for uploads. It also displays the nearest place that needs pictures (if location permissions and GPS are enabled), and alerts users if they have user talk messages that have not been read.
New upload UI with multiple uploads
Users can now upload multiple images within the app itself, without needing the stock gallery workaround. They still need to enter a title and description for every image, but can select categories once for all of them. We made this trade-off because we did not want to risk vandalism where someone uploads their entire camera roll in one shot, but we also wanted to provide some convenience for genuine multiple uploaders. We may reconsider this in the future depending on feedback. The upload UI was also revamped.
At this stage, all of the new features proposed in this grant have been completed.
This release was predominantly funded by WMCZ; its only relation to this grant was bug fixes and minor alterations. More details here.
This release contains the final bug fixes for this grant, but is predominantly based on the changes proposed in our new Project Grant. We made this decision because some of the bugs found in the v2.9 and v2.10 releases were fixed by the code overhauls proposed in the new grant.
Outcomes and impact
What are the results of your project?
Please discuss the outcomes of your experiments or pilot, telling us what you created or changed (organized, built, grew, etc) as a result of your project.
Histogram of file actions made with the app per quarter
We are using the Commons app stats tool (maintained by Yusuke Matsubara) to visually track the file actions made using our app: uploads, edits, deletions, and overwrites. The histogram below was taken on 23 Aug 2019; this grant ran from 2017 Q4 to 2019 Q3 (statistics collection is still ongoing for this quarter).
During this grant, we achieved the highest number of uploads ever made with the app, in 2018 Q2. Uploads decreased slightly after this peak (we hypothesize due to the stability issues we faced in the second half of this grant, elaborated on below), but remained well above pre-grant levels, and the numbers of deletions and overwrites continued to decrease as the grant progressed.
Number of images used in Wikimedia projects
We used GLAMorous to obtain the total number of images uploaded via the Commons app that were used in Wikimedia articles. This measure is a good indicator of the usability of images uploaded via our app.
- Total image usages: 45833
- Distinct images used: 20098 (16.63% of all images in the category)
Map of P18 edits made with our app
A major focus of this grant was the enhancement of the Nearby Places that Need Photos feature, with the implementation of direct uploads. After a picture is successfully uploaded via a direct upload from Nearby, the app now adds the image to the Wikidata item by editing the P18 property. This feature aims to reduce the number of geo-located Wikidata items that lack pictures, and consequently to provide pictures for Wikipedia articles that lack them. The screenshot below displays these uploads on a map of the world, via the Wikidata Query Service. We're thrilled to see P18 edits made via our app from all over the world!
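The general shape of such a map query is shown below, held in a Python string for illustration. The actual query we used additionally filters for edits made by the app, which is omitted here; this sketch just maps geo-located items that have an image:

```python
# A Wikidata Query Service (SPARQL) query that maps geo-located items
# with an image (P18). "#defaultView:Map" tells WDQS to render the
# results as a world map instead of a table.
P18_MAP_QUERY = """
#defaultView:Map
SELECT ?item ?itemLabel ?coords ?image WHERE {
  ?item wdt:P18 ?image ;    # item has an image
        wdt:P625 ?coords .  # item has geographic coordinates
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 100
"""
```

Pasting a query of this shape into query.wikidata.org renders each matching item as a pin on a map.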
Progress towards stated goals
Please use the below table to:
- List each of your original measures of success (your targets) from your project plan.
- List the actual outcome that was achieved.
- Explain how your outcome compares with the original target. Did you reach your targets? Why or why not?
|Planned measure of success (include numeric target, if applicable)||Actual result||Explanation|
|All tasks completed||All tasks completed||Target reached|
|A total of more than 11,000 distinct images uploaded via the app are used in Wikimedia articles||20,098 distinct images uploaded via the app are used in Wikimedia articles as of 23 Aug 2019 (query)||Target reached and exceeded|
|More than 500 places that need photos will have photos added to them via the app||862 places that need photos had photos added to them via the app as of 23 Aug 2019 (query)||Target reached and exceeded|
|Less than 10% of images uploaded via the app over a 2 week period at the end of the grant are unusable (deleted)||3.45% of images uploaded via the app from 3 Aug - 16 Aug were deleted (source)||Target reached and exceeded|
|Overwrite incidents are reduced to <0.5% of all uploads (measured over a 2 week period at the end of the grant)||Overwrite incidents from 3 Aug - 16 Aug are reduced to 0.14% of all uploads (source)||Target reached|
Think back to your overall project goals. Do you feel you achieved your goals? Why or why not?
Yes, I feel that we achieved our project goals. All of the measures of success were reached or exceeded, and the project has grown substantially. Unfortunately, it took much longer than expected to complete, but it is now done.
We are trying to understand the overall outcomes of the work being funded across all grantees. In addition to the measures of success for your specific program (in above section), please use the table below to let us know how your project contributed to the "Global Metrics." We know that not all projects will have results for each type of metric, so feel free to put "0" as often as necessary.
- Next to each metric, list the actual numerical outcome achieved through this project.
- Where necessary, explain the context behind your outcome. For example, if you were funded for a research project which resulted in 0 new images, your explanation might be "This project focused solely on participation and articles written/improved, the goal was not to collect images."
For more information and a sample, see Global Metrics.
|Global metric||Achieved outcome||Explanation|
|1. Number of active editors involved|
|2. Number of new editors||1728||number of users whose first contribution to Commons was via mobile app, from Nov 2017 to Aug 2019 [source]|
|3. Number of individuals involved|
|4. Number of new images/media added to Wikimedia articles/pages||57,457||number of images uploaded to Commons using our app from Nov 2017 to Aug 2019 [source]|
|5. Number of articles added or improved on Wikimedia projects|
|6. Absolute value of bytes added to or deleted from Wikimedia projects|
- Learning question
- Did your work increase the motivation of contributors, and how do you know?
Indicators of impact
Do you see any indication that your project has had impact towards Wikimedia's strategic priorities? We've provided 3 options below for the strategic priorities that IEG projects are most likely to impact. Select one or more that you think are relevant and share any measures of success you have that point to this impact. You might also consider any other kinds of impact you had not anticipated when you planned this project.
Option A: How did you increase participation in one or more Wikimedia projects?
Option B: How did you improve quality on one or more Wikimedia projects?
Option C: How did you increase the reach (readership) of one or more Wikimedia projects?
Please provide links to all public, online documents and other artifacts that you created during the course of this project. Examples include: meeting notes, participant lists, photos or graphics uploaded to Wikimedia Commons, template messages sent to participants, wiki pages, social media (Facebook groups, Twitter accounts), datasets, surveys, questionnaires, code repositories... If possible, include a brief summary with each link.
- GitHub page - we use GitHub as our code repository and for collaboration
- Website - landing page for the app
- Google Play store listing - where Android users can download the app
- Stats tool - for visualizing uploads and deletes
- My personal blog - contains regular updates on the app
The best thing about trying something new is that you learn from it. We want to follow in your footsteps and learn along with you, and we want to know that you took enough risks in your project to have learned something really interesting! Think about what recommendations you have for others who may follow in your footsteps, and use the below sections to describe what worked and what didn’t.
What worked well
What did you try that was successful and you'd recommend others do? To help spread successful strategies so that they can be of use to others in the movement, rather than writing lots of text here, we'd like you to share your findings in the form of a link to a learning pattern.
What didn’t work
What did you try that you learned didn't work? What would you think about doing differently in the future? Please list these as short bullet points.
A major issue that we faced was that the time spent troubleshooting, diagnosing, and fixing bugs grew exponentially as the app grew, eventually equaling the time spent actually developing new features (which led to this grant taking double the anticipated time). There were a few common root causes of this phenomenon:
- Much of our core backend code is still based on its original incarnation from 5 years ago (a long time in the Android development world). This results in deprecated libraries and API calls, failure to adhere to modern best practices, and a lack of standardization, as we have a patchwork of old and new code
- More new features resulted in more code, greater app complexity and more potential areas for bugs
- More users, especially across a userbase with devices and OS versions as diverse as Android's, resulted in more bugs being discovered
- Buggy pull requests from well-meaning new volunteers - we do our best to test all PRs, but bugs do slip through, and we were not very vigilant about monitoring new libraries added by volunteers
- We have a high number of third-party components and libraries in our app, thus increasing complexity and the potential for difficult-to-solve bugs and crashes down the road
- Android updates by Google that render code obsolete or introduce problems - for instance the new Oreo release caused frequent crashes for users until we managed to isolate the cause
Some of these causes are unavoidable (more users and more new volunteers are always good, and Android updates are inevitable), but others can be tempered with better software development practices and stricter code quality measures. Thus, we learned two very valuable lessons from this grant:
- We need a more solid and consistent foundation to build new features on
- Feature/userbase expansion needs to be tempered with technical and code quality improvements that aid maintenance
These lessons were the basis for the next grant that we applied for.
If you have additional recommendations or reflections that don’t fit into the above sections, please list them here.
Next steps and opportunities
Are there opportunities for future growth of this project, or new areas you have uncovered in the course of this grant that could be fruitful for more exploration (either by yourself, or others)? What ideas or suggestions do you have for future projects based on the work you’ve completed? Please list these as short bullet points.
We applied for a Project Grant for our next steps, which are detailed here.
Part 2: The Grant
Please copy and paste the completed table from your project finances page. Check that you’ve listed the actual expenditures compared with what was originally planned. If there are differences between the planned and actual use of funds, please use the column provided to explain them.
|Expense||Approved amount||Actual funds spent||Difference|
|Salary (Josephine)||13800 USD||13800 USD||0|
|Salary (Vivek)||5070 USD||5070 USD||0|
|Salary (Neslihan)||7436 USD||7436 USD||0|
|Total||26306 USD||26306 USD||0|
Do you have any unspent funds from the grant?
Please answer yes or no. If yes, list the amount you did not use and explain why.
Confirmation of project status
Did you comply with the requirements specified by WMF in the grant agreement?
Please answer yes or no.
Is your project completed?
Please answer yes or no.
We’d love to hear any thoughts you have on what this project has meant to you, or how the experience of being an IEGrantee has gone overall. Is there something that surprised you, or that you particularly enjoyed, or that you’ll do differently going forward as a result of the IEG experience? Please share it here!
This grant was our first attempt at scaling up from a one-person project to a three-person team, with more ambitious features than we had ever developed in the history of the app. We learned many lessons regarding code complexity and maintenance, but also ended up producing a product that we were proud of.