Grants:Project/Harej/Librarybase: an online reference library/Final
Welcome to this project's final report! This report shares the outcomes, impact and learnings from the grantee's project.
Part 1: The Project
In a few short sentences, give the main highlights of what happened with your project. Please include a few key outcomes or learnings from your project in bullet points, for readers who may not make it all the way through your report.
Shortly after completing the midpoint report, I became a contractor for the Wikimedia Foundation. As this made me ineligible as a grantee, I discontinued my work on Librarybase and returned the remaining grant funds to the Foundation.
Please copy and paste the project goals from your proposal page. Under each goal, write at least three sentences about how you met that goal over the course of the project. Alternatively, if your goals changed, you may describe the change, list your new goals and explain how you met them, instead.
From the original proposal:
Our goal is to deliver software tools:
- To perform reference extraction on English Wikipedia as described above, using DOIs and at least one other strategy;
- To migrate this data to Wikidata or the Librarybase wiki;
- To allow users to perform lookups of this data;
- To generate source recommendations for at least five WikiProjects.
Important: The Wikimedia Foundation is no longer collecting Global Metrics for Project Grants. We are currently updating our pages to remove legacy references, but please ignore any that you encounter until we finish.
- In the first column of the table below, please copy and paste the measures you selected to help you evaluate your project's success (see the Project Impact section of your proposal). Please use one row for each measure. If you set a numeric target for the measure, please include the number.
- In the second column, describe your project's actual results. If you set a numeric target for the measure, please report numerically in this column. Otherwise, write a brief sentence summarizing your output or outcome for this measure.
- In the third column, you have the option to provide further explanation as needed. You may also add additional explanation below this table.
Looking back over your whole project, what did you achieve? Tell us the story of your achievements, your results, your outcomes. Focus on inspiring moments, tough challenges, interesting anecdotes or anything that highlights the outcomes of your project. Imagine that you are sharing with a friend about the achievements that matter most to you in your project.
- This should not be a list of what you did. You will be asked to provide that later in the Methods and Activities section.
- Consider your original goals as you write your project's story, but don't let them limit you. Your project may have important outcomes you weren't expecting. Please focus on the impact that you believe matters most.
If you used surveys to evaluate the success of your project, please provide a link(s) in this section, then briefly summarize your survey results in your own words. Include three interesting outputs or outcomes that the survey revealed.
Is there another way you would prefer to communicate the actual results of your project, as you understand them? You can do that here!
Methods and activities
Please provide a list of the main methods and activities through which you completed your project.
Please provide links to all public, online documents and other artifacts that you created during the course of this project. Even if you have linked to them elsewhere in this report, this section serves as a centralized archive for everything you created during your project. Examples include: meeting notes, participant lists, photos or graphics uploaded to Wikimedia Commons, template messages sent to participants, wiki pages, social media (Facebook groups, Twitter accounts), datasets, surveys, questionnaires, code repositories... If possible, include a brief summary with each link.
GitHub code repositories:
The best thing about trying something new is that you learn from it. We want to follow in your footsteps and learn along with you, and we want to know that you took enough risks in your project to have learned something really interesting! Think about what recommendations you have for others who may follow in your footsteps, and use the sections below to describe what worked and what didn't.
What worked well
What did you try that was successful and you'd recommend others do? To help spread successful strategies so that they can be of use to others in the movement, rather than writing lots of text here, we'd like you to share your findings in the form of a link to a learning pattern.
- See original learning pattern: Grants:Learning patterns/Wikidata mass imports
What didn’t work
What did you try that you learned didn't work? What would you think about doing differently in the future? Please list these as short bullet points.
If you have additional recommendations or reflections that don’t fit into the above sections, please list them here.
Next steps and opportunities
Are there opportunities for future growth of this project, or new areas you have uncovered in the course of this grant that could be fruitful for more exploration (either by yourself, or others)? What ideas or suggestions do you have for future projects based on the work you’ve completed? Please list these as short bullet points.
- My original Wikidata-editing scripts need to be updated to reflect the accelerated growth of Wikidata over the past year. Because there are now several million journal article items on Wikidata, accounting for a large share of Wikidata's overall growth, my scripts (and their dependencies, including the Wikidata Query Service) are no longer adequately performant: scripts take too long to run, or fail to run at all due to query timeouts.
- More work should be done on modeling data for publications other than journal articles. Also, more work can be done with "author disambiguation," i.e., associating author claims on documents with Wikidata entities representing those authors.
- The amazing work of the WikiCite community plays a significant role in the Knowledge Integrity program at the Wikimedia Foundation.
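The query-timeout problem described above is commonly worked around by paginating a large SPARQL query into smaller requests. A minimal sketch of that approach follows; the query text, entity ID, and batch size are illustrative assumptions, not taken from the original Librarybase scripts.

```python
# Sketch: splitting one large Wikidata Query Service (WDQS) query into
# LIMIT/OFFSET batches so each request stays within the service's
# per-query timeout. Purely illustrative; the Librarybase scripts may
# have used a different strategy.

def batched_queries(base_query: str, total: int, batch_size: int = 10000):
    """Yield SPARQL query strings covering `total` results in batches."""
    for offset in range(0, total, batch_size):
        yield f"{base_query}\nLIMIT {batch_size}\nOFFSET {offset}"

# Example: page through journal-article items (wd:Q13442814,
# "scholarly article").
QUERY = "SELECT ?item WHERE { ?item wdt:P31 wd:Q13442814 . }"
pages = list(batched_queries(QUERY, total=25000, batch_size=10000))
# Produces three queries: OFFSET 0, OFFSET 10000, OFFSET 20000.
```

Note that OFFSET pagination assumes a stable result ordering between requests; for very large result sets, keyset-style pagination (filtering on the last item seen) is often more reliable.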
Part 2: The Grant
Please copy and paste the completed table from your project finances page. Check that you’ve listed the actual expenditures compared with what was originally planned. If there are differences between the planned and actual use of funds, please use the column provided to explain them.
| Expense | Approved amount | Actual funds spent | Difference |
| --- | --- | --- | --- |
| Management, strategy, and lookup tool development | $10,000 | $5,000 | $5,000 |
| Citation extraction tool development and other code development work as needed | $10,000 | $0 | $10,000 |
Do you have any unspent funds from the grant?
Please answer yes or no. If yes, list the amount you did not use and explain why.
Yes. I was only able to complete half of the grant, as during the second half I was hired as a contractor for the Wikimedia Foundation.
If you have unspent funds, they must be returned to WMF. Please see the instructions for returning unspent funds and indicate here if this is still in progress, or if this is already completed:
No additional funds were actually disbursed, so I don't think there are funds I need to remit.
Please answer yes or no. If no, include an explanation.
Confirmation of project status
Did you comply with the requirements specified by WMF in the grant agreement?
Please answer yes or no.
Is your project completed?
Please answer yes or no.
Grantee reflection
We’d love to hear any thoughts you have on what this project has meant to you, or how the experience of being a grantee has gone overall. Is there something that surprised you, or that you particularly enjoyed, or that you’ll do differently going forward as a result of the Project Grant experience? Please share it here!