Grants talk:APG/Proposals/2017-2018 round 1/Wiki Education Foundation/Proposal form

Thank you, WikiEd; your proposal has been well received. We'll be reviewing it in the next few days. Delphine (WMF) (talk) 15:35, 2 October 2017 (UTC)

Questions

From Liam

Dear LiAnna (Wiki Ed), thank you for this detailed and professional proposal. If you don't mind, I've several questions that came to mind while reading this application:

Thanks, Wittylama and Mike_Peel, for the questions and for your careful review of our application. I know it’s long, and I appreciate your taking the time to ask such thoughtful and detailed questions! Answers inline.
  • I didn't understand what was meant by the phrase "...if the instructor has added that they will grade only on what sticks on Wikipedia, not on what students did, we won’t approve the course page...". Does that refer to deleted articles? Is that related to the later stats that "only 2.6% of our students' mainspace edits were reverted and only 1.9% were deleted”? Wittylama (talk) 22:59, 26 October 2017 (UTC)
    • Earlier in the program, we had instructors who would tell their students that they would grade on what was in Wikipedia at the end of the course. This was problematic because, for example, some students somehow can't grasp the difference between a neutral, fact-based encyclopedia article and an analytical essay. The student would do a lot of work putting together an analytical essay and put it into Wikipedia; it would quickly get reverted or deleted. Usually, the student would then edit-war to keep putting the essay back in and implore editors to leave it up until the instructor graded it, since they were worried about their grade in the course. So we make it clear that instructors must grade on what students actually do and not on what “sticks” on Wikipedia; in this case, the student could submit the essay off-wiki instead and would be graded on that (with presumably lower marks, since they failed at writing neutrally, but not a failing grade, since they did write a good analytical essay). We do everything we can in the approval process to ensure that the assignment design elements follow what we’ve determined are the best practices for having the least negative effects on the Wikipedia volunteer community, which is why we’re proud of the low reverted and deleted content numbers. --LiAnna (Wiki Ed) (talk) 18:11, 1 November 2017 (UTC)
  • How does a course get "selected" to be supported by your program? You speak of "closing content gaps" [on the English Wikipedia], which certainly would be an outcome of many of the courses' subject areas, but do you proactively target professors who are teaching in areas you've identified as notable content gaps? Furthermore, what gaps have you identified, and how? Wittylama (talk) 22:59, 26 October 2017 (UTC)
    • Frankly, with the possible exception of military history, there are pretty significant content gaps in all academic content areas on the English Wikipedia. That being said, we do choose to specifically target content areas in a couple of ways: (1) We foster partnerships with academic associations in certain disciplines. Some of these (e.g., Society for Marine Mammalogy) have identified content gaps on Wikipedia on their own and approach us to seek partnerships; others (e.g., National Women’s Studies Association) we know need work on Wikipedia, and we approach them. (2) We create themed initiatives where we target certain disciplines. We engaged in a "Year of Science" campaign for 2016, where we targeted STEM and social science classes (and targeted related partnerships, which is how the American Chemical Society is a partner). Currently, we’re engaged in what we call the "Future of Facts" campaign, where we’re targeting improving content in politically relevant topic areas, since Wikipedia is where an informed citizenry gets its knowledge about issues. That’s why we’ve recently signed a partnership with the American Studies Association, for example, and recently recruited courses at the American Political Science Association conference. Outside of our targeting, however, a lot of classes come to us through our visibility. For these classes, someone will hear about us from a colleague, do a web search for teaching with Wikipedia, or find us through some other means, and simply create a course page through our Dashboard system without us knowing about them. Another common way is that English Wikipedia editors will notice a disruptive class project not being supported by us and ask us to help (e.g., earlier this week). We get anywhere from 50 to 100 new classes each term that just come to us without us doing the outreach, so these are across all disciplines. --LiAnna (Wiki Ed) (talk) 18:11, 1 November 2017 (UTC)
  • Your budget allocates 5.7 FTE and $1M towards this classroom program (of which 40% is requested from this grant). You also are targeting to engage 16,000 students this year and state a cost per student of $62. Since it is not mentioned otherwise as a source of income, I assume that all interaction with these universities is undertaken without fees charged for your time to the institution? Can you describe what 'service' that $62 per student buys them, and whether there are different 'levels' of support that you offer (especially since, if I'm not mistaken, I as an instructor could create a course without requiring permission/support from WikiEd)? Wittylama (talk) 22:59, 26 October 2017 (UTC)
    • Currently, all our students receive the same level of support (online trainings, printed brochures if the instructor requests them, access to our online resources, and support from our Wikipedia Content Experts on staff). You as an instructor absolutely can run a class assignment without our support, but unless you are a longtime Wikipedia editor, you’re likely to unknowingly run afoul of some policy or guideline and get directed to us by an editor (see my above answer). We are in the process of exploring options for a fee-for-service model, which I’ll answer more in depth in response to your next question. --LiAnna (Wiki Ed) (talk) 18:11, 1 November 2017 (UTC)
  • Relatedly, since you're working with "institutions ranging from community colleges...to Ivy League schools", can you say whether you've attempted to have the richer of these institutions financially support your activities in their classrooms, enabling you to cross-subsidise your work with the smaller and poorer educational institutions? For example, I see four courses offered at the University of Pennsylvania this current semester, the 7th richest university in the country (world?). Did allocating your staff resources towards that institution mean you did not have the capacity to support other, less well endowed, institutions? [edit: To clarify, this is not to say you shouldn't be working with such universities per se, but to ask whether it represents value for money within the context of your organisation's scope: Education. As the FDC we deliberately do not compare FDC applicants with a geographic scope (Wikimedia Chapters) against their country's costs of living; we look at the value for money within their organisation's scope. So, with reference to the scope of your organisation, what is "good value" when thinking about educational outreach? For example, are you formally geographically and linguistically scoped to North America and the English language (if so, where is that stated?), or are you formally scoped to the field of "higher education" broadly defined?] Wittylama (talk) 22:59, 26 October 2017 (UTC)
    • This is absolutely a great question and something we targeted in this year’s annual plan: figuring out a solution (you can even see our discussion about it on our draft annual plan talk page!). We are currently actively seeking funding for an organizational development grant that would fund a business development exploration of what this would look like. Some questions we’re considering: Would this be a per-student fee (similar to the Lumen Learning model, since they’re our peers in the OER space)? Or would we have tiered support, where participants at institutions that pay an annual fee would have access to some additional support that we haven't defined yet? Or would we create some sort of continuing education certification program, where participants whose institutions pay a fee would get access to some certification for teaching with Wikipedia? Or can we get funded through the ed-tech space with our Dashboard software? Which department within the institution pays: the library, the teaching and learning center, the career center, etc.? What is a reasonable fee for an institution? There’s a lot to be worked out for this model, but we think it has potential, and we are expecting to receive a grant in the next couple of months that will fund the work to determine what the most effective model will look like. In terms of the targeted outreach part of your question, though, we don’t actually do much targeting at the university level (with the current notable exception of HBCUs); instead, we do the vast majority of our outreach at academic association conferences. That means we seek, for example, instructors who are members of the Linguistic Society of America (LSA) to be part of our program. The diversity of course levels we support comes from the fact that LSA members teach everywhere from community colleges to Ivy League institutions, so our subject-matter focus casts a wide net on the types of institutions these classes come from. UPenn, your example, really shows this: one of the three current classes came from our LSA booth, one just showed up on our Dashboard one day without us doing any outreach to that person, and the third is taught by someone who has been editing Wikipedia since 2012. So none came specifically from any outreach that targeted UPenn. Our current focus on the U.S. and Canada, higher education, and English comes from our previous strategic direction; we actually began discussing whether this should be expanded for the next strategic direction at the board meeting we had this weekend, but have not reached consensus yet. --LiAnna (Wiki Ed) (talk) 18:11, 1 November 2017 (UTC)
  • Considering "instructor retention" (and, relatedly, "number of courses") seem to be important ways you describe and define the success of your work, why are these not part of your grantee-defined metrics, especially instead of "words added"? Equally, why is the "number of students" (16,000) listed in your Quantitative targets, 2018 rather than the "number of instructors" (or "retained instructors")? Wittylama (talk) 22:59, 26 October 2017 (UTC)
    • Words added is the measurement of one of the main impact targets of the program, which is quantity of content added (the other being quality, which we’ll discuss below). The number of students metric comes from the FDC-defined shared metrics (students = participants for this program). You’re right that we do measure numbers of instructors, retained instructors, and courses as internal metrics, and I could easily add those to our grantee-defined metrics box; I’m not actually sure why I didn’t. :) Should I just edit the proposal form to add them? Since this is my first time through the FDC process I don’t want to presume I should just edit the proposal, but I’m happy to add those goals in. --LiAnna (Wiki Ed) (talk) 18:11, 1 November 2017 (UTC)
  • I find this "wp10" score in ORES a fascinating method for achieving a measure of quality improvement that is scalable. It is imperfect, as you acknowledge, but is better than anything else 'out there' other than manual assessment of each article. While I see that an individual article's progress can be interrogated within the dashboard (structural completeness), the two 'teal and lavender' histograms you've included to visually demonstrate the improvement (notably of stubs -> mid-quality articles) of whole cohorts of articles are static graphs. Are the originals or interactive versions available somewhere, as these are effectively the linchpins of your qualitative metric? Also, is there any documentation about the utility/reliability of the "wp10" model other than the short paragraph at mw:ORES#Assessment scale support (and your 'introducing structural completeness' blogpost)? Wittylama (talk) 22:59, 26 October 2017 (UTC)
    • I’m glad you’re as fascinated by this as we are! The originals are in a part of our Dashboard software accessible to our staff only, but I really like the idea of making them visible to everyone, both on the class and the campaign level. Sage Ross (our product manager) and I discussed what it would take technically to do this, and Sage has added it to the feature request list for the Dashboard. I’m also super interested in the idea of making it interactive so you could see the development over the course of the term. Sage said that’s probably even harder, but we added it to the feature request list too. We’re not sure how feasible it is, but stay tuned! In terms of the documentation, our understanding is that the ORES wp10 model is based on what Nettrom developed for the "Misalignment Between Supply and Demand of Quality Content in Peer Production Communities" paper. Halfak_(WMF) may also be able to point you to more specifics if you have questions. (A minimal example of querying the wp10 model is sketched below, after this list of questions.) --LiAnna (Wiki Ed) (talk) 18:11, 1 November 2017 (UTC)
      • Since I got a ping I might as well chime in here. Yes, the wp10 model is based on the research published in the paper LiAnna (Wiki Ed) mentions; if you're interested in reading it, a PDF is easy to find. That paper builds upon work we published at WikiSym in 2013, which is of lesser importance given that the more recent paper shows a strong performance improvement. Working with Halfak_(WMF) on developing the model specifically in ORES, we have made some further performance improvements (e.g. better wikitext parsing, better datasets). The approach used in the Wiki Ed Dashboard is similar to the one Halfak_(WMF) used in the "Keilana Effect" paper (where the PDF is on Commons, so I linked it directly). Recent research looking at predicting article quality in Wikipedia has shown that ORES performs well. More complex models can outperform it somewhat, but are not readily applicable (e.g. a recent Deep Learning paper showed a performance improvement but cannot be deployed). That's what I have off the top of my head; if more information is needed, it might be better to continue the discussion in a different thread (e.g. on my talk page)? Cheers, Nettrom (talk) 22:52, 1 November 2017 (UTC)
    • @Wittylama: Just wanted to follow up that we have made the ORES graphs available on a campaign level now, and deployed it to both our Dashboard and the P&E Dashboard (where it works for English Wikipedia programs). So you can follow along with our current term as students add the majority of their work over the next three weeks, or check out English Wikipedia initiatives tracked on the P&E Dashboard like Art + Feminism. --LiAnna (Wiki Ed) (talk) 22:44, 27 November 2017 (UTC)
  • The fact you've "tripled the number of classes we supported over a three year period" through the use of the dashboard is super impressive. Congratulations :-) I understand from the blue box titled "A note about the Program & Events Dashboard" that there has been a "fork" of the dashboard into two separate but parallel tools. And if I understand the caveats described in that box, WikiEdFoundation will collaborate on and share software with the WikimediaFoundation's dashboard when it is compatible (technically, and in terms of organisational needs), but you are not doing development work for the WMF specifically, correct? If I also understand correctly from your description, the primary reason that the WMF's version has a "pared-down feature suite" is that the WMF's is localisable to different languages while yours is English-only? [correct me if I've misunderstood]. While I understand that supporting localisation increases the complexity of adding any other features, it also seems a shame, and a waste of movement resources, to have two effectively identical (but incompatible) pieces of software trying to achieve the same goal in the same manner. As the dashboard could arguably be described as your flagship product, and one of which you should be justifiably proud, could you elaborate on how you're ensuring it is useful to others in the wikiverse, and not only those lucky enough to be enrolled in courses affiliated with WikiEdFoundation, in USA universities, on En.WP? Wittylama (talk) 22:59, 26 October 2017 (UTC)
    • On the small bit about the Dashboard and whether Wiki Ed is doing or going to be doing development work for the Wikimedia P&E Dashboard (I deliberately call it that to make the distinction between the Dashboard and the Wikimedia version of it), I want to bring in the WMF explanation. When looking at WikiEd's eligibility for the APG-FDC process, we decided that we wanted to keep the development and/or maintenance of the existing Wikimedia P&E Dashboard outside of the APG grant request. Our rationale behind this is that development and maintenance of the Wikimedia P&E Dashboard does not constitute a core focus of Wiki Ed, and has been conducted as a somewhat separate venture, regulated by ad hoc technical contracts. We wished to keep this flexibility and to see the Wikimedia P&E Dashboard-specific work kept separate from this grant request. Hopefully this sheds some light on the rationale laid out in the blue box. Delphine (WMF) (talk) 08:33, 1 November 2017 (UTC)
    • Thanks for chiming in with the WMF perspective, Delphine. I’ll try to clarify a bit more too, since it’s a confusing situation. The P&E Dashboard isn’t technically a fork: it’s still the same code base; there are just features that are turned off for the P&E Dashboard. When we release features that are relevant for both (so when we build the campaign-level ORES improvement public view feature mentioned above), we will also deploy them for the P&E Dashboard. It’s not English-language support that is turned off; WM-NY, A+F, and a bunch of other English programs use the features we developed for our work (like the WhoColor stuff and the ORES scoring, since both technologies only work on some languages) on the P&E Dashboard. Instead, what’s turned off is features we specifically developed for our program management needs. One example that I hope can illustrate this better: our Dashboard has a button we can click on a course page that automatically adds information about the course (link to the on-wiki course page, estimated number of participants, which Wikipedia Content Expert staff member is assigned to it, specific dates when things are supposed to be done, etc.) to our Salesforce database; then the Dashboard will update the statistics about the course (number of students, number of articles edited, amount of content added, etc.) in that Salesforce record throughout the term. This is a super useful feature for us: we can initially project out when students will be editing throughout the term based on dates, and then, after the term is over, do data analysis of the impact of courses based on things like course subject, new vs. returning instructor, whether it’s an undergrad or graduate school course, etc. But this feature is only useful if you have our customized Salesforce database, which means it’s not helpful at all for any other movement group. So anything we develop for our Dashboard that could be potentially useful for the P&E Dashboard we deploy there too, and anything that’s developed for the P&E Dashboard gets deployed to our Dashboard too, since it’s not a fork. I hope this helps clarify, but please do ask if you still have questions as I know it’s complicated! --LiAnna (Wiki Ed) (talk) 18:11, 1 November 2017 (UTC)
  • With regards to the Visiting Scholars program: I fully understand the point that this is a boutique program that does not massively scale (which is by no means a criticism; that is just the nature of this kind of activity). However, I am struck by the fact that your target is to fill 15 positions with a budget allocation of 270k for this, i.e. it will cost $18,000 per visiting scholar to run this program. Aside from a "small honoraria" (a concept I endorse), the capital expenses of the program (access to the educational resources, desk space, insurance, etc.) are borne by the university itself. Is it fair to say that the lion's share of this program's budget is the staff-time cost of liaising between the institution and community to find the right fit, i.e. that $18,000 is the per-Wikipedian cost of recruitment? At the risk of making reductio ad absurdum calculations, if the target of these 15 visiting scholars is to improve 435 articles by 10 ORES points (or 29 articles per scholar), then that is a cost of $620 per article improved? Wittylama (talk) 22:59, 26 October 2017 (UTC)
    • Yes: the majority of the costs are the staff time of liaising between the institution and community to find that right fit. But the math gets a little more complicated than a straight per-article cost, because we don’t recruit a fully new batch of connections each year; it’s more of a sunk cost than an annual cost. For example, User:Wehwalt, our Visiting Scholar at George Mason University, was originally paired with GMU when the Visiting Scholars program was run by WMF. Wiki Education spends minimal staff time to ensure his access is renewed each year, and he continues to produce Featured Articles using the sources from GMU that go into our impact numbers for this program. As we add more and more Scholars, the per-article cost decreases; we’ve only been running this program for two years, so it’s still in the more-expensive phase. --LiAnna (Wiki Ed) (talk) 18:11, 1 November 2017 (UTC)
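For anyone who wants to poke at the wp10 model discussed above, below is a minimal, illustrative sketch (not Wiki Ed's actual Dashboard code) of querying the public ORES API for a revision's predicted quality-class probabilities and collapsing them into a single 0–100 number, roughly in the spirit of the "structural completeness" measure. The endpoint and response shape follow the ORES v3 API; the class weights and revision IDs below are placeholder assumptions.

```python
# Illustrative sketch only, not Wiki Ed Dashboard code.
# Queries the public ORES API (v3) for the wp10 model's quality-class
# probabilities of a single English Wikipedia revision, then collapses
# them into one 0-100 number using an assumed set of class weights.
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"

# Assumed weights: evenly spaced from Stub (0) to Featured Article (100).
CLASS_WEIGHTS = {"Stub": 0, "Start": 20, "C": 40, "B": 60, "GA": 80, "FA": 100}


def wp10_score(rev_id: int) -> float:
    """Return a weighted 0-100 quality estimate for one revision."""
    resp = requests.get(ORES_URL, params={"models": "wp10", "revids": rev_id})
    resp.raise_for_status()
    score = resp.json()["enwiki"]["scores"][str(rev_id)]["wp10"]["score"]
    return sum(CLASS_WEIGHTS[cls] * p for cls, p in score["probability"].items())


if __name__ == "__main__":
    # Placeholder revision IDs: compare an article before and after a course.
    before_rev, after_rev = 123456789, 123456999
    print(f"Estimated improvement: {wp10_score(after_rev) - wp10_score(before_rev):.1f} points")
```

Running something like this for every article a cohort touched, before and after the term, is one way to produce improvement histograms like the teal-and-lavender ones referenced in the question above.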

From Mike

Hello. I've been trying to look at the WikiEd statistics through the dashboard, but I'm finding it quite confusing. Taking spring 2017 articles (which presumably is a complete set now) and sorting by number of bytes added (naively, a proxy for looking at the most-improved articles?) finds en:Indigenous rights to land along rivers, which seems to be a redirect (and actually seems to appear in the list I've linked to above a couple of times); cases like w:Open Space Accessibility in California, which has also been expanded by non-student editors; and w:Howard Atwood Kelly, where content was added but then another editor came along and removed it. I'm not picking on these articles specifically, but it would be good to understand how these kinds of situations are taken into account in the statistics. Also, can you point out some of your favourite examples of articles that the program has significantly improved? Thanks. Mike Peel (talk) 10:51, 28 October 2017 (UTC)

Thanks for this question! It actually caused us to look into what’s happening with that list a bit more and add some new features to clarify that view. :) The indigenous rights article that had been at the top of the list was created by a student in a particularly problematic class in the spring. The article was eventually deleted and a redirect was created to the other article. (We made a series of programmatic changes this fall to hopefully head off problems like that class in the future.) Yesterday, we changed how the Dashboard displays deleted articles, since that was confusing. If a student-created article gets deleted and replaced by a redirect, the deleted revisions will no longer contribute to the numbers; deleted articles without a redirect now have highlighting that indicates they were deleted. Deleted articles are not counted by the Dashboard toward our overall top-line term statistics, but that articles list is intended to give a comprehensive list of all the pages touched by students. The duplicates in that list occur when students enrolled in two different classes edit the same article that term. For articles that are also expanded by non-student editors, only the content added by students is counted toward the numbers (the Dashboard only pulls revisions where the editor is someone enrolled in the Dashboard; a rough illustration of this counting rule is sketched below). Situations like the Howard Atwood Kelly article, where another editor comes along afterward and removes some of what the student did, are counted in the statistics, but we think this is generally offset in our overall numbers by students who forget to log in before editing and thus have their edits attributed to an IP, or who never actually enroll in the Dashboard, so their edits aren’t ever counted.
In terms of student work, I encourage you to check out our monthly reports, which link to examples of good student work each month. I asked in our online staff chat for personal favorites, and here's what our staff came up with, in alphabetical order: en:Coartación (slavery), en:Domestic violence in same-sex relationships, en:Greater long-nosed bat, en:History of medicine in France, en:Kalief Browder, and en:Machine learning in bioinformatics. Enjoy! --LiAnna (Wiki Ed) (talk) 18:20, 1 November 2017 (UTC)
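As a rough illustration of the counting rule described above (only revisions made by editors enrolled in the Dashboard contribute to an article's numbers), here is a hedged sketch against the public MediaWiki API. It is not the Dashboard's actual implementation; the article title and usernames are placeholders, and paging through very long page histories is omitted for brevity.

```python
# Illustrative sketch only, not Wiki Ed Dashboard code.
# Sums the bytes added to one article by a given set of enrolled usernames,
# ignoring everyone else's revisions, via the public MediaWiki API.
# (Result paging for very long histories is omitted for brevity.)
import requests

API = "https://en.wikipedia.org/w/api.php"


def bytes_added_by(title: str, enrolled: set) -> int:
    params = {
        "action": "query",
        "format": "json",
        "prop": "revisions",
        "titles": title,
        "rvdir": "newer",   # oldest revision first
        "rvlimit": "max",
        "rvprop": "user|size",
    }
    pages = requests.get(API, params=params).json()["query"]["pages"]
    page = next(iter(pages.values()))

    total, prev_size = 0, 0
    for rev in page.get("revisions", []):
        delta = rev["size"] - prev_size
        prev_size = rev["size"]
        # Only positive byte changes made by enrolled editors are counted.
        if rev.get("user") in enrolled and delta > 0:
            total += delta
    return total


if __name__ == "__main__":
    # Placeholder usernames for two hypothetical enrolled students.
    print(bytes_added_by("Howard Atwood Kelly", {"ExampleStudent1", "ExampleStudent2"}))
```

A deleted page simply returns no revisions here, which is loosely analogous to how deleted articles drop out of the Dashboard's top-line figures.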

From the WMF

  • Financials, current year: Table 2 shows that you have year-to-date expenses of 412k USD, against projected total expenses of 2,427k USD at the end of the year. While I understand that some costs can't just be split evenly over the year (rent, for example, which is probably monthly), reading this I understand that you plan to spend 2 million USD in just 4 months. Can you explain why this is so? I can't seem to find a good explanation for this anywhere else in the plan. Thanks. Delphine (WMF) (talk) 08:26, 3 November 2017 (UTC)
    • Ah, this is perhaps an interpretation problem. We defined "current year" as the current fiscal year, which for us runs July 1 – June 30. So Table 2 gives the current fiscal year, in which we were at the end of month 3 when we submitted. As shown in our detailed budget Google Doc, $1.357 million of the $2.427 million total budget is for January 1 – June 30, 2018, and so is part of this FDC application; the other $1.07 million has been spent or is planned to be spent from July 1 to December 31, 2017. Should I change the numbers in Table 2 to reflect calendar year 2017 instead of fiscal year 17–18? --LiAnna (Wiki Ed) (talk) 12:54, 4 November 2017 (UTC)
  • Financials, upcoming year: Table 5 and Table 8 seem to show that you'll be running a surplus, but it does not show up anywhere else. Given that the tables might not give the space to list all assets (notably assets that may not be counted as reserves), and seeing that you don't plan to build any reserves, can you give me a rationale for where the money is going to go? Actually, the same question applies to your projected revenues for this year (current year), which are far above your planned expenses (by a little less than 300k USD). Where will that money be used? Thanks. Delphine (WMF) (talk) 08:26, 3 November 2017 (UTC)
    • This represents the difference between fiscal years, calendar years, and grant periods. For example, we are anticipating receiving money this fall to support the Guided Editing project (so the revenue will be in 2017), but the work on this project will span more than a year (you can see our expenses for this in the 2018 detailed budget, but you see no income for that project because the money for it arrives in calendar year 2017). So we'll be entering FY 18-19 with a "surplus", but we also have project expenditures that we will still need to make in order to meet the goals we set out in that grant request, since it will be a multi-year project. We anticipate this will continue being the case at the end of 2018 as well; some of the money we'll be bringing in during 2018 will be for expenditures during 2019. It's not an operational reserve; it's simply funds allocated to a particular project that we've received in our bank account but not yet spent, because the work isn't finished yet. Does that help clarify? --LiAnna (Wiki Ed) (talk) 12:54, 4 November 2017 (UTC)

Thanks for the questions!

Just wanted to let you all know I'm with our board at the in-person strategic planning meeting mentioned in the strategic planning section of the proposal; I welcome all the questions coming in and will respond to them early next week. --LiAnna (Wiki Ed) (talk) 16:07, 28 October 2017 (UTC)

Cofinancing

Similar to Liam's point above: it's great to hear most courses continue the next year, and I often use https://wikiedu.org/blog/ to show academics the value of Wikipedia assignments for their teaching. But I'd then expect at least those universities to recognise the value of this service with a financial contribution. Just as they're happy to spend hundreds or thousands of dollars per student per year on things like expensive subscriptions, equipment and whatnot, they should have no problem contributing a portion of those ~60 dollars per student. --Nemo 13:39, 12 November 2017 (UTC)

Thanks for your comment, Nemo_bis. Glad to hear you find our blog useful and agree with us that we should work on figuring out a model for universities to contribute part of the costs. --LiAnna (Wiki Ed) (talk) 20:01, 13 November 2017 (UTC)