
Grants talk:IEG/Committee/Workroom




Guidelines for multiple IEGs to same person


Hi all,

Would the committee like to develop some more concrete guidelines about how we handle multiple grants to 1 grantee going forward? It seems that the situation may come up more frequently as the program grows.

History and current status:

  • During the pilot round of IEG, we had a brief discussion about making multiple grants to 1 person, when the committee recommended 2 grants be made to 1 user. In the end, we did make both of those grants.
  • Now in round 2, I see that we have at least 4 different individuals who are listed as potential grantees on more than 1 proposal.
  • We also have a case where a round 1 grantee is listed as a potential grantee on another proposal in round 2.
  • And finally, we've also said that we're willing to renew some 6 month IEGs for an additional 6 months if impact and need for continued funding is demonstrated.
  • What this all means, effectively, is that we could end up in situations where someone has multiple grants running at a time, for multiple rounds in a row.

On one hand, I think these are generally good signs that active community members are excited about the program, see IEG as a suitable avenue for support, and want to share their ideas for projects with us. On the other hand, it raises some difficult questions around diversity of grantees and the extent to which we think it is important that these funds be distributed as widely as possible across the movement.

Background context:

  • During the round 1 discussion about Ocaasi's grants, as I recall, it was decided that we mostly felt ok making multiple grants to 1 person because a) IEG is primarily concerned with funding proposals for projects that we think will have the most potential impact on the movement (the who being perhaps secondary to the what, therefore) and b) the 2 projects combined for that 1 grantee totaled less than $30,000, which is the maximum cap that we'd set for any single grant - in effect the committee and WMF were willing to fund 1 person to do multiple small projects instead of 1 person to do 1 larger project.
  • The GAC doesn't place limits that I know of on multiple or repeat grantees, nor does the Participation Support Program. In some cases, they may even consider a grantee's track record of doing well with a previous grant to be a positive factor in making a second grant to the same user. However, the context is different for the GAC because a) there hasn't been any element of scarcity or competition to date for those grants - ie, if they have 50 good grant proposals over a rolling application period, they may fund all 50 of them, whereas in IEG one more grant for person A means one less grant for person B and b) Project and Events Grants are often a venue for organizational capacity-building in which repeat granting is a path to a chapter's growth. IEG may or may not be similarly a venue for individual leadership capacity building in our movement, depending on how you look at it.

Questions to spark discussion:

  • Do you see IEG as a venue for investing in leadership capacity and growth of individuals in our movement? Is it an investment in good ideas where we find them? Is it an opportunity to distribute wealth across the movement to as many folks as possible? We likely have a range of different views about the role and purpose of these grants and so I'd be curious to hear your thoughts.
  • Do you think there should be a cap on the number of IEGrants any 1 person can have (and if so, in terms of number of grants at a time, forever, or based on a dollar amount)?

Looking forward to hearing your thoughts! Cheers Siko (WMF) (talk) 17:22, 17 October 2013 (UTC)Reply

Personal opinion: At a bare minimum, I think that no IEGrantee should have more than 2 IEG projects or $30,000 in IEG funding at any given time. This is because a) more than 2 projects at a time is likely to impact a grantee's ability to be successful, and takes up more grantee slots than I'm comfortable with any 1 person having, and b) $30,000 is the cap we've set for any 1 grant, and I don't think it would be fair for 1 grantee to receive more than this cap. I would be open to more restrictive guidelines than this, depending on where this discussion goes, but I don't think I'd be comfortable with anything less restrictive than this. Siko (WMF) (talk) 17:27, 17 October 2013 (UTC)Reply
As someone this relates to directly, I'll try to make some unbiased comments. I have benefitted greatly from being able to work on more than one grant at a time, and it's empowered me to devote far more of my schedule to Wikipedia exclusively and sustainably. I think we have to balance giving grants to folks who have demonstrated competency and who have great ideas with encouraging diversity of grantees and ideas, and do so while minimizing the risk of concentration. There are competing pragmatisms here: 1) editors with good ideas who can realize them are a value we should cultivate, but 2) we want to cultivate those skills in lots of people, some of whom may need to go through a process of learning over time. Because of this, I think a reasonable guideline is that no one person should have more than 2 grants and $30,000 going at any one time. I think that simple numerical rule could avoid a lot of issues that could come up repeatedly. Ocaasi (talk) 18:53, 17 October 2013 (UTC)
  • I agree with Siko, 2 grants or $30,000. I had the same in mind before even reading Siko's opinion. In addition, that person shouldn't be given any more grants until the end of both projects. And if we need to make it more restrictive, then how about "A grantee shouldn't have more than 3 projects funded in 2 consecutive rounds"? -- ɑηsuмaη «Talk» 19:47, 17 October 2013 (UTC)
  • If there is a limit to the amount of grant money, and there are other projects which could be turned down only because the funding limit is reached, then I think that if a person has two approved grants they should pick one to wait for the next round. If there's no money problem that wouldn't be an issue. If a person is in the middle of one grant from a previous round and wants another, I think it should depend on whether they are keeping up with the timeline for the first grant. Anne Delong (talk) 04:34, 18 October 2013 (UTC)Reply
  • I agree with having the restriction of up to 2 grants and up to $30,000 in total for one grantee in a term. I think consecutive or extended grants could be judged on a case-by-case basis. That said, they may have a harder time in review than the first application did, because, as I see it, the committee tends to seek something new, not just something good that has already been tried. If a project gets established and needs additional funding to keep producing its outcomes, GAC grants may be better suited than IEG. --whym (talk) 02:16, 19 October 2013 (UTC)
  • I'm between Anne and Whym. I agree with the $30,000 limit per grantee in a term but I think having multiple small grants could be ok and think we shouldn't set a slot limit arbitrarily. I agree with Whym that we should keep in mind that we can refer people to GAC. --Pine 06:43, 19 October 2013 (UTC)Reply
  • My opinion on these matters is that a grantee should be limited to 2 approved proposals per round. A grantee should focus on realizing and succeeding with as few projects as possible in the given period of time (2 grants per round), and propose their other ideas and suggestions in other round(s). Regarding the finances, I would not like to put a limit in numbers (except maybe on the total fees from all of a grantee's projects per round), because project expenses may vary widely due to the length and scope of the work. --MikyM (talk) 21:31, 19 October 2013 (UTC)
  • While I do understand the viewpoint that sharing the IEG funds around in a diverse way is a factor we need to consider, I think we should look at proposed grants based on their potential impact on the movement more than who is doing them. I do think as part of due diligence we should ensure that a grantee will have enough time to work on multiple grants if successful, and that this should perhaps be part of the committee review as well as the staff review, but I don't think we should have an automatic decline on a grant just because the applicant already has 2 in the round, so that a third one, while an excellent idea, can't be approved because they already have two. If the other grant applications were, in theory, awful, I don't think we should approve those and not the applicant's third. In short, I don't think we should impose any hard and fast rules on the number of grants that can be given in a round to one applicant, as long as they have the time to successfully execute all of them. A limit of $30,000 makes sense - I see the possibility of multiple grants being smaller projects, not large ones whose total budget would be over $30k. Steven Zhang (talk) 22:31, 19 October 2013 (UTC)
  • I hate all arbitrary limits, and thus don't see any problem with multiple grants and tons of cash going to one grantee, as long as the ideas are awesome and help move the projects forward in the same general direction as the overall strategy guidelines dictate. In fact I wasn't even aware of the $30,000 limit to begin with. That said, I am fine agreeing to 2 grants or $30,000. Whatever is decided, I do feel there need to be good dashboards in place so that you can pull the plug on a grantee who is underperforming or underdelivering. The only risk is when major amounts of cash are being paid up front, such as for expenses for people to travel who have not delivered anything yet. These are those "leap-of-faith" cases which you base on the person's "track record", whatever that may be. As long as you pay-as-the-grant-goes-forward or pay-on-delivery this shouldn't be a problem. I think that the IEG process is a great initiative and I support this completely, but it is definitely still in a start-up phase. In this phase, I think it is only realistic to assume the financial risk of success/failure should lie 50% with us and 50% with the grantees. So sure, go ahead and fund people, some of them multiple times, but track your overall financial risk while doing so. For each additional grant, you could increase the financial risk on behalf of the grantee, thus reducing the overall burden of risk for the IEG budget. This in itself would be a strong motivator for the grantee to wait until the next round before moving forward with the second grant application. I think you can make some general guidelines for that, though, that will lead to approval decisions like "though the grantee has requested 80% of the funding up front, the "candidate-probability-score-of-success" in our guidelines restricts us in this case to only paying 30% up front and not a penny more until delivery". If we decide to get seriously burned in the first five years as part of a learning process in how to trust grantees, we will get a lot of backlash from the community. I have never been on the approvals side of such decisions before in a professional capacity, but I have tracked large project portfolios in the past and have seen how small projects aggregated together tend to lose more money than the big ones. Generally there are more eyes on the big projects and, as the saying goes, "when the cat's away, the mice will play". Whenever a grantee has more than one project running, the reports should be merged somehow so that the overall financial risk to our budget is transparent for everyone. However we feel about approving multiple grants to the same person, we also need to take into account the trust that the community has in us and how financial failures would come across, keeping in mind that the typical community member is not going to bother reading all of our guidelines and policies. Our "financial failures" will also need to be publicly confessed, and these are the grants made that result in zero or less-than-promised deliverables. Of course I realize that our grant review rubric is currently lacking a financial column, but I think we should try to make one in order to make our financial risk trackable. Jane023 (talk) 09:27, 20 October 2013 (UTC)
Jane - just to briefly respond to a couple of your points about risk and payments: We do use a 2-disbursement method for IEGs, which means the first half of the funds is provided up front, since many of our individual grantees aren't able to fund their own projects in advance (that's why they've come to us, oftentimes). The second half of the money is handed to them once they've completed a midpoint report and it has been accepted. This mitigates some of the risk as you're suggesting, and seems to have worked pretty well so far. We've also added criteria J around risk in general for this round, which does include financial risk in the calculation. And then, too, systems like capping the total grant size at $30,000 per project, having grantees write monthly reports, and WMF staff doing due diligence and holding monthly check-ins with them on their projects are other risk mitigation techniques we've tried thus far. After all the round 1 grantees finish their projects, let's revisit risk together as a group to see if we think these existing controls are working based on the data, or if there need to be further improvements. Sound good? Siko (WMF) (talk) 01:41, 23 October 2013 (UTC)
  • I don't like limits, but in this case I like 2 grants per year, with a limit of $50K. Why? The grantee must have the time (human-hours, HH) to develop the project, and if they have 2 or more grants at the same time, the quality of the projects would be affected, because the HH are divided across the projects, making the expected results unreachable. About the money limit: if the grantee is requesting money to develop two projects with different goals (e.g. one grant to develop a MediaWiki tool and another for a community study), spending about $50K across both grants sounds fine to me, because that is a large amount to develop both projects, and this funding stream is meant for individual projects. If the idea(s) are pretty good but need a lot of funds, the grantee should ask the GAC, as our "big brother". Regards Superzerocool (talk) 14:06, 20 October 2013 (UTC)
  • I agree that a maximum of $30,000 per person should remain the limit even with several projects. Still, the grants should focus on ideas, not on persons. How many projects one grantee can have should imho be considered case by case, as there are so many possibilities for how the funds can be spent on teams and costs apart from working time. --Ailura (talk) 18:09, 20 October 2013 (UTC)
  • I am with most of you - multiple grants should be possible if the ideas and work are reasonable and if the person can credibly claim that they are willing and able to do several projects in parallel. A limit of $50,000 rather than $30,000 might be better, to leave room to work on the ideas and projects. -- Achim Raschka (talk) 13:17, 21 October 2013 (UTC)

Thanks, everyone, for sharing initial thoughts around this! I'm not interested in having arbitrary rules either, but I do agree with Ocaasi's point that simple numerical guidelines can help avoid repeated issues, ease wider community concerns that may arise, and help us not spend lots of time on the same discussion in each round/case. I hope a discussion like this might help us come up with a basic guideline that makes sense and makes all of our work easier in the long run, while ensuring that we're being good stewards of movement resources. :)

We've definitely got some different views about how to handle this! So, here is what I think I'm hearing so far:

cap at 2 grants and $30,000 at a time
  1. Ocaasi
  2. Ansuman
  3. whym (+consider extensions/renewals separately or case-by-case)
  4. Jane023
cap at $30,000 per round
  1. Pine
  2. Steven Zhang
  3. Ailura
cap at $50,000 per round
  1. Achim Raschka
cap at 2 grants and $50,000
  1. Superzerocool (but prefers none)
cap at 2 grants per round
  1. MikyM
cap at 1 grant at a time
  1. Anne Delong
also some preference for no limits
  1. Superzerocool
  2. Jane023

Based on what we're seeing from each other here, does anyone have additional thoughts, opinions, or suggestions to add to help bring us closer to a decision?

One additional thought I have: guidelines do not have to be hard rules. For example, we might decide that we feel strongly about having some sort of firm cap on the dollar amount, but prefer to put a soft cap on the number of grants, where we'd be more willing to make case-by-case exceptions. Or we might decide to set both kinds of caps as general guidelines, and say that in exceptional circumstances with good rationale we'd go above either kind of cap...

Curious to hear your thoughts! Siko (WMF) (talk) 01:41, 23 October 2013 (UTC)Reply

(moved from email thread) I'm curious how we would think about and calculate a situation where someone with multiple grants had at least one grant shared with another person. For example, this round I proposed Generation Wikipedia with a budget of $20,000, but I would split all of the project management with Keilana. Would that mean that if I was fortunate enough to work on that--plus a hypothetical second grant--the second would have to be capped at $10,000? Or would we only count the Generation Wikipedia grant for half ($10,000), freeing up $20,000 for an additional project?
I'm just using this as an example because I want to make sure the 2 grant / $30,000 cap that's being discussed would permit enough flexibility to work on two meaningful projects at the same time. The context here is that I hope to devote my full time to projects, where the community thinks it's useful of course, and these kinds of budgeting questions are things I have to consider if I am able to work solely on projects that impact Wikipedia. Ocaasi (talk) 20:26, 25 October 2013 (UTC)
(moved from email thread with permission) This is a good clarifying question, thanks for asking it. Some background info from the operational side of things that might be helpful for this discussion:
When 2 people share an IEG project, WMF usually splits the disbursement equally among them (or if not equally, in a percentage appropriate to each grantee's role in the project) and creates a separate grant agreement for each person. In effect, then, each person has their own smaller grant, which is a subset of the total project amount. In the current Wikisource grant, for example, Micru and Aubrey each signed a grant agreement for half of the total project amount, and they each have responsibility for just half of the funds (although together they file joint reports, etc.). To my mind, this means that we could consider Ocaasi's amount for a Generation Wikipedia grant to be $10,000, rather than the total project amount of $20,000, if the committee thought that made most sense. --Siko 20:26, 25 October 2013 (UTC)
Sorry for my late comment. I agree in principle that IEG funds the projects, not the person, but I think that a person can only lead a limited number of projects (in my opinion a maximum of 2), because project leadership takes time, and leading more than two projects creates a high risk for the deliverables. A limit on money may be interesting as a way to limit the size of the projects (basically three small projects or one big project), but in my opinion it is not the best solution, even if it is an appropriate solution for now. --Ilario (talk) 15:15, 4 November 2013 (UTC)

Proposal


What do folks think about this, as a solution, given that we've still got a variety of different views on the matter?

"As a general guideline, a single IEGrantee should be responsible for no more than 2 projects and $30,000 in funding at any given time. This total should take into account the grantee's individual portion of responsibility for all IEG project renewals, extensions, and new projects underway in any given round. The committee may choose to recommend exceptions to this limit, however, when there is good rationale based on the specific circumstances."

Siko (WMF) (talk) 00:38, 19 November 2013 (UTC)Reply

Scoring process, including handling of recusals and working groups


Based on discussions so far, and looking at other scheduling, this is what I think an updated scoring process looks like in this round:

  1. Working group members each submit scores individually using the Google form for their group (Siko to provide this form via email, as in last round) (23 Oct - 3 Nov)
  2. Siko anonymizes the results, shares the scoring-form output spreadsheet (with full-text comments) back to each working group, and posts a summary of the results in each working group's Google doc. (5 Nov)
  3. Working groups (with Committee review coordinator and Siko's help as needed) prep the results templates with scores and summarized feedback for each proposal on iegcom-wiki (5 Nov-13 Nov)
  4. Committee review coordinator pastes results templates to each proposal on meta-wiki (13 Nov)
  5. Working groups discuss in Google docs to form a working group recommendation (9 Nov - 13 Nov)
  6. Crat blocks recused committee members from iegcom-wiki and mailing list (14 Nov)
  7. Siko aggregates up the working group recommendations to the entire committee on iegcom-wiki (14 Nov - 18 Nov)
  8. Committee (minus recused) discuss on iegcom-wiki and form the final shortlist recommendation (18 Nov - 24 Nov)
  9. Siko takes that shortlist and WMF completes due diligence and processes final grant approvals (25 Nov - 15 Dec)

Thoughts? Errors? Omissions? Improvements? Siko (WMF) (talk) 23:25, 17 October 2013 (UTC)Reply

  • I have no idea how to use a Google form, so I hope it's not complicated. Also, for discussions that need to be private, but better formatted than an e-mail list, has anyone considered a wiki? Four little generic wikis with passwords would do it, with a page inside for each proposal. Everyone would already know how to edit them. I used DokuWiki to coordinate a large conference committee last spring and it worked well. They could be reused for the next round. Anne Delong (talk) 04:51, 18 October 2013 (UTC)
Another thought, with an eye for streamlining: Is there a really good reason the working groups would like to have a deliberation and make sub-recommendations before the entire committee deliberates? We could simplify this significantly if the working groups scored and worked together to make the templates with summarized feedback for each proposal, but didn't deliberate to make a working group recommendation (then we don't even have to decide on how you discuss as a working group!). Skipping step 5, we could just block the recused after scoring and templates are done, and have everyone left just deliberate together on what rose to the top from the scoring (similar to what we did last time). Pine, I can't remember now why when we discussed this at the end of round 1 we thought there should be 2 deliberations - what do you think? I feel like we may be getting overcomplicated here :) Siko (WMF) (talk) 16:24, 18 October 2013 (UTC)Reply
The more I think about this, the more I think it would make my life, and probably yours too, easier, without sacrificing anything critical. Let me know by Monday if anyone would be opposed to this? Siko (WMF) (talk) 16:40, 18 October 2013 (UTC)Reply
OK, how about this?
  1. Working group members each submit scores individually using the Google form for their group (Siko to provide this form via email, as in last round) (23 Oct - 3 Nov)
  2. Siko anonymizes the results and shares the scoring-form output spreadsheet (with full-text comments) back to each working group. (4 - 8 Nov)
  3. Working groups (with Committee review coordinator and Siko's help as needed) work in a group-specific Google doc to aggregate comments for posting on Meta, along with aggregated scores. (9 - 13 Nov)
  4. Committee review coordinator pastes aggregated scores and aggregated comments to each proposal on meta-wiki (14 Nov)
  5. Crat blocks recused committee members from iegcom-wiki and mailing list (14 Nov)
  6. Siko posts full text comments to IEG wiki (15 Nov)
  7. Committee (minus recused) discuss on iegcom-wiki and form the final shortlist recommendation (16 Nov - 24 Nov)
  8. Siko takes that shortlist and WMF completes due diligence and processes final grant approvals (25 Nov - 15 Dec)
--Pine 06:54, 19 October 2013 (UTC)Reply
I think that works, Pine. Having used Google Forms before, I find it absolutely horrid, but I can't think of another way to make it work. Steven Zhang (talk) 23:17, 19 October 2013 (UTC)
Reading the above gives me a headache, but fine, I will await the said Google doc (with some trepidation). Would it be a good idea if we sent around our top 3 choices for funding, with our reasons why, via mail beforehand? Otherwise it's a pretty lonely decision process between us as individual interpreters of the grantee's words. You might take another view when you read the endorsement of someone else. Also, why do I see so few endorsements by committee members? Is this "not done" for some reason? Jane023 (talk) 10:00, 20 October 2013 (UTC)

Sounds good, though I have never used Google Forms before. --Ailura (talk) 18:18, 20 October 2013 (UTC)

Apologies for not having written much in the past few weeks, I've recently moved and it's taking quite a while to settle down! One minor point with regards to a crat blocking recused members: it doesn't necessarily need to be a crat who does the blocking, any admin should also be able to temporarily block a user from the wiki (thus temporarily revoking their wiki access). Thehelpfulone 21:34, 20 October 2013 (UTC)Reply
We have only one non-crat admin and it's someone who will need to be blocked. They could block themselves I guess.--Pine 18:45, 22 October 2013 (UTC)Reply
I have worked with Google Forms and it did not work very well at times - but I think we can use them, as they are easy to handle. -- Achim Raschka (talk) 13:21, 21 October 2013 (UTC)
OK great - I like these updates too, and am adding here just a couple more small tweaks to the timing (mostly about my bits, because I'll be on an airplane most of the 15th headed home from Berlin). I agree Google Forms can be a bit painful, but given the constraints it still seems like the best working option. So, let's go with this for the final schedule, and I'll update our various pages/docs accordingly:
  1. Working group members each submit scores individually using the Google form for their group (Siko to provide this form via email, as in last round) (23 Oct - 3 Nov)
  2. Siko anonymizes the results and shares the scoring-form output spreadsheet (with full-text comments) back to each working group. (4 - 7 Nov)
  3. Working groups (with Committee review coordinator and Siko's help as needed) work in a group-specific Google doc to aggregate comments for posting on Meta, along with aggregated scores. (8 - 13 Nov)
  4. Crat blocks recused committee members from iegcom-wiki and mailing list (end of day 13 Nov)
  5. Committee review coordinator pastes aggregated scores and aggregated comments to each proposal on meta-wiki (14 Nov)
  6. Siko posts scoring outcomes to iegcom-wiki (I'll likely post the ranked list of proposals that came out of your scores, and make sure that all members have access to all full text comments, but not spend hours re-posting each of those comments to the wiki) (14-15 Nov)
  7. Committee (minus recused) discuss on iegcom-wiki and form the final shortlist recommendation (16 Nov - 24 Nov)
  8. Siko takes that shortlist and WMF completes due diligence and processes final grant approvals (25 Nov - 15 Dec)

Siko (WMF) (talk) 20:46, 22 October 2013 (UTC)Reply

Reorganizing committee pages on meta


Pine suggested that scoring would be easier if we reorganized some of the committee's materials that have grown over time, to make things tidier and make it easier to gather everything a member needs in one place. Can committee members provide some more specifics here? Any insights you can add about things/pages/info that feel particularly scattered or difficult for you to find as you're going about your workflow would be helpful. Thanks! Siko (WMF) (talk) 20:39, 24 January 2014 (UTC)

  • I think we could combine or archive some of the discussions and discussion pages like Grants talk:IEG/Committee and Grants talk:IEG/Committee/Workroom. Also I think we have 2 or 3 places discussing Committee procedures that could be combined and streamlined. I think looking at a map or flow chart of all IEGCom pages would be a good way to find pages to combine and prune. Also I think we should have a neatly organized archive of proposals from previous rounds. --Pine 06:32, 25 January 2014 (UTC)Reply
Hi Pine! We've started reorganizing the committee pages a bit - as you can see, the workroom has now been updated to bring things like the review subpage into the main workroom page, making info a bit less scattered. Next on the to-do list is making a committee navigation template that will collect all the committee links to things that can't live on the workroom page into one handy spot where you can jump to everything else you should ever need - we'll add that navigation to the workroom soon! Feedback on these workroom changes is, as always, most welcome :)
In terms of the discussion pages, my suggestion is to keep this page as the active committee's sole discussion page. Pages like Grants_talk:IEG/Committee currently redirect to the main IEG questions page, and to me that makes sense because that main committee page is very public-facing....we expect prospective committee members and other curious folks to show up there, and routing them to the main IEG questions page may be more useful than routing them to the workroom page where you current members are trying to actively coordinate your own work. But, if you feel strongly that you'd like the redirect for that page to come here instead, I'm fine with that....it just means we'll have more of this discussion mixed with random passers-by (which could be a good or bad thing, depending how you look at it!). And if you spot another talk page that needs a redirect, please be bold :)
Because proposals from previous rounds can be updated for new rounds, the categories that we use to produce lists in any given round can change, making archives not super easy to generate on meta at this point. I track what we reviewed each round in my grantmaking spreadsheets to meet our staff needs. Lists of proposals you review in any given round are also stored in IEG-com wiki, though, not using categories...that might be an easy place for someone from the committee to take a pass at organizing a neat archive, if the committee feels a need for this? Cheers! Siko (WMF) (talk) 23:26, 6 March 2014 (UTC)Reply

Criteria updates for Round 3


Given the discussions at Grants talk:IEG/Learning/Round 1 2013/Impact and Grants:IEG/Learning/Round 1 2013/Impact#IEG Selection Criteria, does anyone have suggestions for changes to the scoring criteria for Round 3? For example, should we add new criteria or give more weight to some criteria by assigning them more points? --Pine 06:50, 14 April 2014 (UTC)

I see the scoring criteria, but are the criteria weights published anywhere? Or are they not public? rubin16 (talk) 07:02, 14 April 2014 (UTC)
Criteria have not been weighted in the past, to keep things simple, so that's why there's nothing published. If folks feel some things should be given more weight, please list them here? Siko (WMF) (talk) 16:57, 14 April 2014 (UTC)
One big change I've been thinking to make this round: having people provide 3 scores rather than 12, to help simplify the process a bit (as requested from the last round). To do this, we could have you provide one score for each of the 3 main dimensions (the larger categories), rather than scoring each sub-point, and just include comments about specific concerns/joys about any of the subpoints as an input for us to look at further during due diligence and deliberations. I'm not sure that having so many detailed numerical scores has been very helpful in the past (it seemed to make things more work for both the committee and proposers to understand, without any clear benefits), so distilling them into the bigger picture of 3 main areas might be worth trying this round. If we don't like the experiment, we can always change back in the next round. Thoughts? Siko (WMF) (talk) 16:57, 14 April 2014 (UTC)Reply
One additional dimension I've considered adding, based on the impact report findings: Community engagement - we could move criteria E into this dimension, and add other subpoints about community engagement throughout the project's lifecycle as well (feedback & target audience, per impact report suggestions). Community engagement feels like it has been proven to be extra important in the IEG process by now, so making it stand out feels like it could be a useful change. By implementing this along with my suggestion above, there would be 4 dimensions, but still only 4 scores rather than 12+ (trying to save you guys some agony!). Thoughts? Siko (WMF) (talk) 16:57, 14 April 2014 (UTC)Reply
I've just sandboxed both of these changes - feedback/tweaks welcome. If there's no strong opposition to trying this experiment, I'll implement it this week. Cheers, Siko (WMF) (talk) 21:19, 15 April 2014 (UTC)Reply
Hi, I think trying to reduce the number of scores is good but making each score more complex is going to cause its own difficulties. I think the increased complexity will require about as much time as before although there will be more subjectivity as committee members think about how to weigh the larger number of issues for each score. I wish I had a good idea for how to handle this. We could delegate some of the scoring work to WMF such as having WMF be responsible for the evaluation of the costs. --Pine 01:20, 16 April 2014 (UTC)Reply
You're right to worry about increasing complexity, Pine - agreed that could cause more problems than it solves. So, I just tried simplifying down the considerations in the rubric into sets of key questions that any one of us would consider before making this grant, in hopes of making this simpler - thoughts? It is hard to remove any more criteria if the committee aims to provide a holistic recommendation, and what we learned from the impact report has only suggested additional criteria be added (but 14 scores would be moving in the wrong direction!). I don't know why scores would be any more subjective when considered at the big-picture level rather than broken down into 12 - either way, the reviewer's perspective and subjectivity is going to come into play. You may be right that ultimately either way is the same workload for the committee! That's probably true and something we may be able to live with given the current levels of proposals per working group. Still, I'd be interested in seeing if a new format might make the output of the assessment a bit simpler for proposers to understand, with a greater balance of useful and specific qualitative feedback to accompany the numbers. Siko (WMF) (talk) 06:12, 16 April 2014 (UTC)Reply
The link to the Google form doesn't work for me. rubin16 (talk) 06:45, 16 April 2014 (UTC)
@Rubin16 and Sbouterse (WMF): Just quickly: the same link without the trailing "/" works for me. --whym (talk) 08:33, 16 April 2014 (UTC)
Thanks, now it's clear! @Sbouterse (WMF): when we use fewer criteria, it becomes more difficult to compare and differentiate proposals: for example, one project may be slightly better in one of the components (by 0.5 points, say, but not enough to change the mark by a whole point), and that would be more difficult to score now. Maybe we should switch from a 5-point to a 10-point evaluation range? rubin16 (talk) 08:37, 16 April 2014 (UTC)
rubin16, I can see how 10 points would give you more range - updated now, see what you think? And here's a new version of the aggregated feedback given to proposers to consider also. Siko (WMF) (talk) 06:12, 17 April 2014 (UTC)Reply
Seems OK to me. rubin16 (talk) 06:36, 17 April 2014 (UTC)
  • After taking another look, I still prefer the old format which I think provides more specificity by having each question be its own score. This may take longer for Committee members to complete but I feel it is clearer, less complex, less subjective, and more meaningful for evaluation and transparency. I like the idea of simplifying the scoring process and would be interested in alternative ways of doing that without sacrificing clarity and specificity. Sorry Siko, I think this was worth trying, but I think the more specific scoring matrix has a lot of advantages even though it is labor-intensive. --Pine 07:13, 17 April 2014 (UTC)Reply
Thanks for sharing your input, Pine! I'd like to see some more people's thoughts here as well though before deciding on a course of action - we do need to make some updates based on what's been learned from the past, so either way some changes do need to be made, and settled pretty soon so that I can get everything ready for you guys to score starting on Monday. I do want to emphasize that this isn't only about the committee's time (though of course we don't want to add work for you guys), so I would encourage folks to think about the broader context of the proposer's experience as well when they weigh in.
Meanwhile, Pine, I wonder if you can point me to an example to help me better understand where you've seen the older version working so well in past rounds. Looking at the scores from the last round, I'm not seeing the kind of clear & meaningful variation you're mentioning - see, for example, the aggregated scores from tools proposals - I wish we saw more nuanced variation between each item, but it looks like there's mostly just a half-point between most of the scores in any given category, which is a lot of small variation for proposers to absorb without necessarily telling them something that qualitative comments couldn't accomplish better. Can you point me towards what you're looking at that's different? Do you think there is such a large risk with trying out the new method that would greatly outweigh the benefit of what we'd learn by experimenting with the change for 1 round? (I'd be ok with running an experiment that ultimately proved your hypothesis right and mine wrong, by the way!) Siko (WMF) (talk) 20:26, 17 April 2014 (UTC)Reply

I see Pine's point--going from 12 scores to 4 seems like we then have to do the mental math of weighing sub-elements and that makes it seem more arbitrary or vague. My background in education, however, informs me that overspecification of assessment can be just as arbitrary and give a false sense of precision. So, I rather like this new very clean framework which although less separated by element feels more intuitive and holistic to me. It's also more approachable for the proposers and I like grantmaking that makes sense to the grantees, a lot. I think it's worth a shot this round, and my hunch is that the new committee members will find it reasonable and old members will get used to it after a few ratings. Ocaasi (talk) 20:50, 17 April 2014 (UTC)Reply

  • I think our overall approach in the previous two rounds has proven very successful and I am reluctant to make major changes to a successful project. I think it makes sense to make modifications based on the feedback from WMF Evaluation but that doesn't extend to a complete redesign of the scoring tool, which I take to mean that we don't need to, and shouldn't, make major modifications to a major component of what is a generally very successful IEG program. I would also like us to remember that the scores and averages aren't our primary way of ranking recommendations. We mainly rank recommendations by the number of yes or no "votes" to recommend a proposal. Evaluation didn't suggest changing this method and I think the overall success of the IEG program suggests that our current approach works. The scoring tool that we have now is a good system for helping evaluators to think about a variety of criteria as they consider whether to recommend a proposal. All that being said, I don't think it would be the end of the world to experiment with this new scoring tool as long as we keep the same ranking system that we used in the previous two rounds. --Pine 08:02, 18 April 2014 (UTC)Reply
Thanks, everyone, for sharing your input! Summarizing some last thoughts on the matter and next steps...
  • WMF's Learning & Evaluation unit's report studied outcomes rather than process - so making recommendations on process changes was considered specifically out of scope. However, when I met with them to discuss their report, we dug into process a bit and this change to scoring at a level up was actually something they suggested. I am sorry I didn't explain that sooner - I'd completely forgotten that happened off-wiki :)
  • The pilot round feedback from proposers on the current assessment system does indicate there is room to improve in terms of their experience with the scoring system. Committee satisfaction in this area was overall higher than proposers' satisfaction. (as a side note, one of the really challenging parts of my job is trying to balance needs/inputs from multiple perspectives in our grantmaking ecosystem - I so appreciate your patience as we continually fine-tune for this balance!)
  • Pine is so right to note that what really matters for the committee decision-making process is the yes/no, and we will keep that the same for this round, as requested :) As such, an experiment which aims to give proposers a more useful assessment without making a change to the yes/no question appears to present no big risk for this committee.
  • Tallying and posting 4 scores per proposal rather than 12 is going to save Harold and me a lot of time when we aggregate the scores and post feedback templates. As Harold's time is particularly precious, and this step is usually a bit rushed and exhausting for us both, we think this could be a nice side-benefit of trying something new.
As it seems the committee is willing to try something new, let's run the following experiment:
  • We'll deploy the new rubric and scoring form for round 1 2014.
  • We'll keep the ranking system the same as last round.
  • We'll deploy the post-review survey to proposers and committee members as we have in past rounds, to help measure outcomes of the experiment. I will continue to work closely with our Learning & Evaluation team to make sure this happens.
  • Measures of success we'll look at in the survey include:
    • committee's sense of workload/time spent should not increase
    • committee's satisfaction w/ review process should not decrease
    • proposer's satisfaction w/ review process should increase
  • Based on measuring the outcomes, we can evaluate whether to keep the updated version in future rounds, or return to the past.
Best wishes, Siko (WMF) (talk) 18:43, 18 April 2014 (UTC)Reply
Sounds OK to me. Thanks for specifying those measures of success. --Pine 05:48, 19 April 2014 (UTC)Reply
Would it be appropriate to notify the current proposers of this change? Or would it be too spammy/confusing? While the change is subtle and more about how the committee works, I think proposers might want to fine-tune their proposals after knowing about the change (especially the community focus being more clearly expressed), and it would be good to remind them of the purposes and priorities of the program in general. whym (talk) 04:51, 20 April 2014 (UTC)
Hi whym, good point. I think because we're already scoring now, it might be more confusing than helpful (I wish we'd wrapped this discussion up earlier and had time to do more along these lines last week - sorry!). I don't think we want to kick off a big flurry of edited proposals while you're in the middle of scoring. If they haven't engaged with the community at all until now, yet another nudge is not likely to help anyway - I updated the selection criteria listed on the main IEG page, and the proposal forms this round all focused on community engagement in more than 1 place from the very start, plus many proposals missing notification or other community engagement did already get a nudge/question during the discussion period on their talk page. If you think a more systematic notification is needed, though, please feel free! Otherwise I suppose they'll be reminded of purpose & priorities for IEG again in 2 weeks once we post the aggregated feedback? Siko (WMF) (talk) 22:42, 21 April 2014 (UTC)

Recusals for members who advise or mentor proposals


Our current Conflict of Interest policy includes "Committee members and WMF grantmaking staff who take on a role of advisor to the point of becoming a champion for any given proposal should recuse themselves from the process of reviewing or giving final approval on the proposal." I propose to change this to "Committee members and WMF grantmaking staff who publicly or privately advocate for or against a proposal should recuse themselves from scoring or giving final approval to the involved proposals if their participation could create an impression of bias that compromises their ability to evaluate objectively. Acting in a mentoring or advisory capacity that does not extend to an impression of advocacy or bias does not require a full recusal from the scoring process, but committee members who mentor or advise a proposal should not score that proposal during the first part of the evaluation process, and they should actively disclose their involvement in any later stages of the evaluation process."

Ideally I think committee members and grantmaking employees would never mentor specific projects but I am hearing that demand for mentors is higher than the available supply. I am not sure that my proposal has the right balance of allowing for mentoring while trying to minimize bias in the scoring process and would appreciate any suggestions for improvement. --Pine 01:45, 16 April 2014 (UTC)Reply

Leaving from the committee


This is just a quick note that I would like to be removed from the IEG committee, because I expect to have insufficient volunteer time for the role in the coming months. I'd like to take the opportunity to thank everyone involved for what has been a great collaboration - I have enjoyed the discussions and reviews a lot. I'll still be around, though. You might see me occasionally participating in IEG's public discussions. @Thehelpfulone, Hahc21, and Siko (WMF): Could you please make the necessary changes on the committee wiki and mailing list? whym (talk) 12:30, 13 February 2015 (UTC)Reply