Wiki Education Foundation/Quarterly reviews/2014-Q1 Digital Infrastructure

From Meta, a Wikimedia project coordination wiki
Presenter's slides

The following are notes from the Quarterly Review Meeting for the Wiki Education Foundation's Product Manager, Digital Services, Sage Ross, on August 22, 2014.

Sage Ross, presenting virtually, at the Quarterly Review.

Present: LiAnna Davis, Jami Mathewson, Sara Crouse, Frank Schulenburg
Participating remotely: Sage Ross

Please keep in mind that these minutes are mostly a rough transcript of what was said at the meeting, rather than a source of authoritative information. Consider referring to the presentation slides, blog posts, press releases, and other official material.


Overview/purpose (Sage): looking ahead at where we might go with technology to solve our problems and reach the next level of programs, across three areas: (1); (2) on-wiki user experience; (3) analytics.

Digital services roadmapping[edit]

Slide 4: Tools for helping people get started effectively/efficiently on Wikipedia:

  • Assignment design wizard
  • Ambassador matchmaking
  • Course page creation UX
  • Rights request UX
  • Improving on-wiki training – learning how to do things with guided tours through the VisualEditor
  • Adding quizzes to training
  • Just in time on-wiki help material

Slide 5: Running and Monitoring courses

  • Article selection tool
  • Activity feed and on-wiki feedback workflow
  • Course dashboards
  • Student portfolios and grading tools
  • Plagiarism checking
  • Ambassador task management

Re: real-time guide Frank: we could invest in data analysis, A/B testing, etc. to see how to improve this over time.
Sage: the GettingStarted extension (MediaWiki) has some “guide” elements, but it’s more of an onboarding experience than a support experience over time (the long-term goal is for that extension to monitor editors’ progress and trigger events over time).
LiAnna: the requirements for “did you know” are very English-language-centric, so it is not as useful for the Foundation, whereas we are focused on en.wp.
Sage: it’s really focused on getting people from 0 to 1 edit and then from 1 to 10 edits.
LiAnna: we’re interested in 50–100 [high quality] edits
Frank: what if we could trigger an appreciation system? i.e. you’ve reached level 2 now and here is what you would need to do to reach level 3. Some gamification elements could be helpful (i.e. here is how you, the user, compare to the others in your group). We have a wealth of things we could do; what could we do, feature-wise, that would help us get the most out of the system? Also, the way we do it will be tailored to the [existing] community.

Re: article selection tool Sage: this tool would make it easier for instructors to focus on under-developed articles that are well-aligned with their class topics.

Re: activity feed Sage: we are working on an on-wiki activity feed whose design allows people to sort by group or by user (so you can see how many edits a specific student made). After this project, it would be helpful to incorporate ways of giving feedback on the edits (a tool for making comments on an edit that would be useful both for community editors and for people who are monitoring class progress).
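The sort-by-group-or-by-user behavior described above comes down to aggregating raw edit records per user. A minimal sketch of that aggregation, using illustrative records; in practice the records would come from MediaWiki (e.g. the API's user-contributions data), and the field names here are assumptions:

```python
from collections import Counter
from typing import Iterable

def edits_per_user(edits: Iterable[dict]) -> Counter:
    """Count edits by username, so the feed can report per-student totals."""
    return Counter(e["user"] for e in edits)

def sort_feed(edits, by="timestamp"):
    """Order raw edit records for display, newest first."""
    return sorted(edits, key=lambda e: e[by], reverse=True)

# Illustrative records; real ones would come from the MediaWiki API.
sample = [
    {"user": "StudentA", "title": "Coral reef", "timestamp": "2014-08-01T10:00:00Z"},
    {"user": "StudentB", "title": "Kelp forest", "timestamp": "2014-08-02T09:30:00Z"},
    {"user": "StudentA", "title": "Coral reef", "timestamp": "2014-08-03T14:15:00Z"},
]
counts = edits_per_user(sample)
```

A feedback tool would then hang comments or "thanks" off individual records in this list.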

Re: dashboard [Summary]: Ideally, we will automate retrieval and display of course data/metrics (vs. current process which is labor intensive) by class.

Re: tools for ambassador task management [Summary]: different ambassadors like to do different things in terms of offering help: some do the basics, such as welcoming new users; some enjoy giving detailed feedback to students (when they are asked to give feedback); some advise instructors on tools for planning their course. We can look to task management workflows on Wikipedia, which work well.
Sage: the tricky part is making it really easy to add tasks (normally you have to be fluent in the way that Wikipedia works). We want to populate those tasks with input from people who are new to Wikipedia.
Frank: at a high level, there are a lot of things that go into giving direction (intervention, etc.), but the other option is to not only correct things that are wrong, but to react to things that are right, to strengthen certain behavior patterns. The Echo “thank you” feature, for instance, seems to be well received by the community. What are the ways to give positive feedback to students?
Sage: positive feedback mechanism could be part of ambassador task management.
LiAnna: …“thank button”…
Sage: there should be both the ability to leave a comment (on wiki feedback) and to “thank”. “Thanks” and “reply/comment” could be built-in as a gadget.

Slide 6: Tracking our key metrics

  • Getting article quality data
  • Automating our routine metrics
  • Program-wide dashboards

Sage: re: tracking key metrics: the ability to quantitatively track article quality is an important and ongoing issue. We want to be able to track more than existing measures such as page views, e.g. to be able to see that participants took x number of articles from stub to B-class, and to have program-wide data that is always current, so that progress being made during a term can be tracked in real time.
Jami: it would also be helpful to pull data based on factors other than those we currently use, such as by professor or by subject area.
Frank: there is no shortage of what could be improved in this area.

Slide 4: Tech roadmap: open questions:

  • What else should be on our radar?
  • Which development projects should we prioritize?
  • What can and should be done within MediaWiki (vs. outside it)?
  • What other technical needs do we have?

Sage: we need to continually think about what we should be doing in MediaWiki and what we should be doing outside of MediaWiki.
LiAnna: one of the things that should be on our radar is that we’re starting other programs in the next few months, and we don’t know at present what the technical needs for those programs are. Let’s note that we may need technical development and support for things outside of just the classroom program.
Jami: yes, such as honors/high achievers program.
LiAnna: the course dashboard is really important but I’d like to see it expanded into the cohort dashboard so we don’t have to base everything around the classroom model.
Sara: gave examples of some existing sites with good annotation and real-time feedback sites (not on MediaWiki).

Progress on the RFP[edit]

Slide 9: Writing the RFP Scope:

  • Assignment design wizard
  • Basic course dashboard
  • Plagiarism checking feasibility study


  • Explain the context and long term vision
  • Emphasize UX
  • Don’t over-specify solutions
  • Find a potential long term partner

Learning points

  • There is no standard format for doing an RFP
  • Emphasis on why we are doing what we are doing

Slide 10: Running the RFP

  • Promotion and communication
  • Proposals received (4 Seattle, 1 SF, 1 Germany)
  • Learning points
    • Many companies don’t do fixed-bid projects these days
    • Inviting interesting companies to bid is effective
    • Fairly consistent estimates from the better firms
    • Ruby and Python shops are the main ones for design-centric projects
    • Best proposals driven by interest in project and brand [wp]
    • Typical rates $80-$175/hr for designers and developers
    • Plan for plenty of time to settle contract before work starts

Sage: we decided to go with WINTR – branding and marketing focused, but everything they do has a strong interactive/UX component to it and is polished; they are also a good cultural fit. WINTR has a strong emphasis on scoping at the beginning so they have figured out how to narrow the work down to the minimum viable product.
Sage: from mid–Sept. through Oct., the main objective of work with WINTR is to get the assignment design wizard up and running; they may also get a basic version of a working course dashboard.
Frank: Sara and Sage should coordinate closely with regard to progress on the design wizard so we stay on track with grant commitments.
LiAnna: very pleased with how the RFP process and WINTR engagement has gone. Sage did a great job at identifying the best option.
Frank: pleased with Sage’s strong focus on UX and establishing a new standard for user experience.

Slide 11: WordPress

  • Fixing rough edges
  • Staff and board profiles
  • For instructors page
  • Open questions:
    • What else do we need from the WordPress side of things?
    • Should we work with WordPress professionals – for design in particular – or can we meet our needs in-house and with current contractors?

Frank: let’s keep WordPress as is – serving its purpose of informing – maybe build out a few features.
Sara: need donations pages and improved info pages about our organization.
LiAnna: will turn over WordPress to Eryk.
Sara: what’s the vision for hub and website? What is driving what?
Frank: what WINTR does will set the standards for our website.
Frank: re: Office Wiki: Sage did great job at setting it up. All agreed that they are happy with how it is.

Overview of on-wiki user experience work to date[edit]

Slide 14: MediaWiki Work

  • Implementation of the education program API
    • Course name and dates
    • Assigned articles
  • Activity Feed
  • Open questions

Discussion re: Inspect Diff tool on MediaWiki: we may be able to have a nicer version of this if the mobile team at WMF develops something good.
LiAnna: building “thank” into activity feed (w/Inspect Diff) would be ideal.

Slide 25: Assignment design wizard

  • Need help with copy and refining the content details
  • Testing, iterative improvements (instructor
  • Success criteria for the development project: when I am confident enough that this will improve the experience for instructors that I change the on-wiki workflow to use it

Slide 26: Course dashboard

  • What will minimum viable product look like?
  • Review and brainstorm potential enhancements
  • User testing (with professors new to WP)
  • Prioritization of additional features

Slide 27: Plagiarism Checking

  • How does Wiki Med’s project impact our plans?
    • Wiki Med’s initial bot is up and running but not yet accurate enough for our use case; we may start from scratch.
    • Sage will keep an eye on it to see whether it will make a useful starting point for our planned plagiarism checking features

Sage: Wiki Med has launched its own project around this, and there is a bot up and running now, but it is too early to know if it will be worth building on (lots of false positives).
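As a rough illustration of what automated plagiarism flagging of this kind involves, and why false positives are common, here is a minimal n-gram ("shingle") overlap sketch. This is a generic technique, not the Wiki Med bot's actual method; the threshold and names are illustrative:

```python
def shingles(text: str, n: int = 5) -> set:
    """Overlapping n-word sequences ("shingles") from a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(student_text: str, source_text: str, n: int = 5) -> float:
    """Fraction of the student's n-grams also found in the source.
    A high value flags a passage for human review; it is not proof of
    plagiarism (quotations and common phrases inflate it)."""
    s = shingles(student_text, n)
    if not s:
        return 0.0
    return len(s & shingles(source_text, n)) / len(s)
```

Properly quoted material and boilerplate phrasing score high under any such measure, which is one source of the false positives mentioned above.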

Next three months’ priorities[edit]

Slide 29: MediaWiki work

  • Get activity feed finished and deployed
  • If successful, plan more MediaWiki work
    • Potentially cooperate with WMF WEP team (share problems)
    • Plenty of bugs worth fixing

Slide 30: User Testing & Interviews

  • Need quick access to pool of instructors (especially newcomers), ambassadors and students willing to try out our online systems
  • Need to talk in depth with instructors, ambassadors, and (once they find their feet) the Wikipedia Content Experts to identify key technical needs that can impact our key indicators (quality of content, rate of incidents, instructor retention)

Slide 32: Getting article quality data

  • Idea: use a crowdsourcing platform and monetary incentives for experienced editors to generate quality data on Classroom Program articles
  • Idea: create a tool to algorithmically estimate article quality

Sage: potential to work with a company like that of the co-developer of Mechanical Turk…
Jami: would it be considered biased if we paid people to rate articles?
Sage: thinks it would be fine; it would be people who already have experience rating articles.
Jami: could be used going forward to tell professors what their courses’ quality was.
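The algorithmic-estimation idea mentioned on slide 32 could start from simple structural signals. A toy sketch with made-up weights; a real tool would fit the weights against existing human assessments (e.g. the Stub–FA ratings), and the feature set here is only an assumption:

```python
import re

def quality_features(wikitext: str) -> dict:
    """Crude structural signals correlated with article maturity."""
    return {
        "words": len(wikitext.split()),
        "refs": wikitext.count("<ref"),          # opening <ref> tags
        "sections": len(re.findall(r"^==", wikitext, flags=re.M)),
    }

def quality_score(wikitext: str) -> float:
    """Toy weighted score; the weights are illustrative, not fitted."""
    f = quality_features(wikitext)
    return 0.001 * f["words"] + 0.5 * f["refs"] + 0.3 * f["sections"]
```

Such a score would only rank articles roughly; mapping it onto assessment classes would require calibration against human-rated examples.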

Slide 33: Course Page UI Overhaul

  • Idea: Get a professional web designer to do a strictly cosmetic overhaul of the course page layout, and hire WikiWorks or HalloWelt! to implement it.

Frank: Very pleased with the achievements. Digital Infrastructure is off to a very good start.