Research:Identification of Unsourced Statements/API design research

Created: 21:59, 31 July 2019 (UTC)
Duration: August 2019 – ??

This page documents a planned research project.
Information may be incomplete and may change before the project starts.


We will interview tool developers and potential tool users (such as members of WikiProject Unreferenced Articles on English Wikipedia) to understand how to design an unsourced statement classification API that is most useful and usable to them.

We will also conduct secondary research (literature review) and consult with the designers of related APIs (e.g. ORES) to ensure that the design of the citation needed API and associated documentation reflects best practices and ethical AI values.
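As a concrete point of reference for the consultation, ORES exposes model scores through REST endpoints of the form `/v3/scores/{context}/{revid}/{model}`. A citation needed API could follow a similar shape. The sketch below is purely illustrative: the `citationneeded` model name and the response layout are assumptions for discussion, not an existing endpoint.

```python
# Illustrative sketch of an ORES-style scoring request for a hypothetical
# "citationneeded" model. Only the /v3/scores/{context}/{revid}/{model}
# URL shape mirrors ORES; the model name and response fields are assumptions.

def build_score_url(context: str, rev_id: int, model: str,
                    base: str = "https://ores.wikimedia.org") -> str:
    """Build an ORES-style scoring URL for one revision and one model."""
    return f"{base}/v3/scores/{context}/{rev_id}/{model}"

# A hypothetical response, shaped like ORES score output: the probability
# that a statement in this revision needs a citation.
sample_response = {
    "enwiki": {
        "scores": {
            "12345678": {
                "citationneeded": {
                    "score": {
                        "prediction": True,
                        "probability": {"true": 0.87, "false": 0.13},
                    }
                }
            }
        }
    }
}

def needs_citation(response: dict, wiki: str, rev_id: int, model: str,
                   threshold: float = 0.5) -> bool:
    """Read the model's 'true' probability and compare it to a threshold."""
    score = response[wiki]["scores"][str(rev_id)][model]["score"]
    return score["probability"]["true"] >= threshold
```

For example, `build_score_url("enwiki", 12345678, "citationneeded")` yields `https://ores.wikimedia.org/v3/scores/enwiki/12345678/citationneeded`. Whether this URL shape, a batch endpoint, or something else fits tool developers best is exactly the kind of question the interviews are meant to answer.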

Research goals

This research project is focused on surfacing design requirements for an API that can be used to develop tools to help editors identify and correct unsourced statements. The results of this work will inform the specifications and documentation of the API used to surface citation needed recommendations, as well as future directions for this research.

The goals of the project are:

  1. Identify best practices for API design and documentation that support use by external tool developers in a Wikimedia context, such as extensibility and flexibility (supporting multiple relevant use cases and contexts of use)
  2. Identify design requirements for supporting algorithmic interpretability, auditing, and feedback by end-users

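To make goal 2 concrete: supporting interpretability and feedback could mean a response that exposes not just a score but the evidence behind it, plus a channel for editors to push back. The sketch below is one hypothetical way such a payload might look; every field name here is an assumption, not an existing schema.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical payload sketch: how a citation needed API might expose
# interpretability (per-feature contributions) and end-user feedback.
# All names and fields here are assumptions, not an existing schema.

@dataclass
class FeatureContribution:
    name: str          # e.g. "contains_statistic", "no_nearby_ref_tag"
    weight: float      # signed contribution toward the prediction

@dataclass
class StatementScore:
    text: str
    probability: float                                    # P(citation needed)
    contributions: List[FeatureContribution] = field(default_factory=list)

    def top_reasons(self, n: int = 3) -> List[str]:
        """Return the n feature names that pushed hardest toward 'needed'."""
        ranked = sorted(self.contributions, key=lambda c: c.weight, reverse=True)
        return [c.name for c in ranked[:n]]

@dataclass
class Feedback:
    """What a tool could POST back when an editor disagrees with a score."""
    rev_id: int
    statement_text: str
    editor_judgement: bool   # True = a citation really was needed

score = StatementScore(
    text="Over 80% of users preferred the new interface.",
    probability=0.91,
    contributions=[
        FeatureContribution("contains_statistic", 0.62),
        FeatureContribution("no_nearby_ref_tag", 0.41),
        FeatureContribution("section_is_lead", -0.08),
    ],
)
```

Exposing contributions like these would let a tool explain *why* a statement was flagged, and a feedback object would give the model's maintainers an auditing signal; whether end users actually want this level of detail is a question for the interviews.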

Timeline

September 3 - 6
  • Conduct literature review (including a review of previous research conducted by the PI, e.g. Growth and diversity of Technology team audiences, Research:Ethical AI) and write up key findings
  • Use key findings to draft interview protocol
  • Schedule consultation meeting with Aaron Halfaker to discuss ORES API design
  • Elicit suggestions from Scoring Platform team of tool developers to contact for interviews
  • Identify interview candidates from the member list of relevant English Wikipedia WikiProjects (e.g. Article Rescue Squadron)
  • Submit interview protocol for Legal review and work with Legal to draft consent form

Collaborators: WMF Legal; WMF Scoring Platform

Deliverables: Literature review summary (wiki page); draft interview protocol (wiki page); list of interview candidates (spreadsheet)

September 9 - 20
  • Finalize interview protocol based on Legal input
  • Conduct consultation meeting with Aaron Halfaker
  • Summarize notes from meeting with Aaron Halfaker
  • Send out invitations to interviewees (target: 3 interviews)
  • Conduct interviews
  • Loosely transcribe interview data (doesn't need to be word for word)
  • Summarize key points

Collaborators: WMF Legal; WMF Scoring Platform

Deliverables: Final interview protocol (wiki page); interview recordings (audio and/or video files); interview transcripts (Google Doc); consolidated consultation meeting notes (Google Doc); consolidated interview notes (Google Doc)

September 23 - 30
  • publish findings, recommendations, and suggest next steps

Collaborators: n/a

Deliverables: Report of findings, recommendations, and implications (wiki page)

November 2019 – ?
(TBD) Potentially: additional interviews, API design specification, API documentation, API user testing

Policy, Ethics and Human Subjects Research

Interviews and surveys will be conducted according to the Wikimedia Foundation's policies for informed consent. All non-public data gathered during this research (including interview recordings and notes) will be shared and stored in accordance with the Wikimedia Foundation's data retention guidelines.

Desk research

Machine learning as a service API docs

Research on API design

  • Murphy, L., Alliyu, T., Macvean, A., Kery, M. B., & Myers, B. A. (2017). Preliminary Analysis of REST API Style Guidelines. PLATEAU’17 Workshop on Evaluation and Usability of Programming Languages and Tools, 1–9. Retrieved from http://www.cs.cmu.edu/~NatProg/papers/API-Usability-Styleguides-PLATEAU2017.pdf
  • Farooq, U., Welicki, L., & Zirkler, D. (2010). API Usability Peer Reviews: A Method for Evaluating the Usability of Application Programming Interfaces (pp. 2327–2336). ACM. https://doi.org/10.1145/1753326.1753677
  • Rama, G. M., & Kak, A. (2013). Some structural measures of API usability. Software - Practice and Experience. https://doi.org/10.1002/spe.2215
  • Stylos, J., Graf, B., Busse, D. K., Ziegler, C., Ehret, R., & Karstens, J. (2008). A case study of API redesign for improved usability. Proceedings - 2008 IEEE Symposium on Visual Languages and Human-Centric Computing, VL/HCC 2008, 189–192. https://doi.org/10.1109/VLHCC.2008.4639083
  • Watson, R. B. (2012). Development and application of a heuristic to assess trends in API documentation. https://doi.org/10.1145/2379057.2379112
  • Vanschoren, J., van Rijn, J. N., Bischl, B., & Torgo, L. (2014). OpenML: networked science in machine learning. https://doi.org/10.1145/2641190.2641198
  • Petrillo, F., Merle, P., Moha, N., & Guéhéneuc, Y. G. (2016). Are REST APIs for cloud computing well-designed? An exploratory study. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 9936 LNCS, 157–170. https://doi.org/10.1007/978-3-319-46295-0_10
  • Glassman, E. L., Zhang, T., Hartmann, B., & Kim, M. (2018). Visualizing API Usage Examples at Scale (pp. 1–12). https://doi.org/10.1145/3173574.3174154

Results

TBD

See also