Great idea to help make it much clearer for those that have great ideas or ambitions to have the opportunity to more easily apply for funding to fulfill those ambitions.
Use of time and resource - how much time and resource would a user have to commit to get a grant filed and approved? As low as possible, ideally.
From August 17 to September 7, 2015, Community Resources held a consultation about an idea for reimagining WMF grants. Based on what we heard from over 200 people during this consultation, WMF plans to move forward with a grants restructure, with the following modifications:
Offer Conference Support and Travel Support instead of the broad "Events" concept. We'll continue to offer the current Travel and Participation Support program together with Wikimania Scholarships, and add Conference Support to provide targeted resources for conference organizers. Project Grants will continue to fund other offline activities, which may include events other than conferences.
Meet the need for speed, simplicity, and flexibility with low-risk, low-cost Rapid Grants. We're developing the microfunds concept into Rapid Grants, which will offer quick funding decisions for requests or experiments that don't need extensive community deliberation. Rapid Grants will be offered throughout the year, to allow for more flexibility.
Keep Project Grants simple. Project Grants will keep the idea of a pipeline for scaling experiments into programs, but we'll make sure applicants don't have to distinguish project stages themselves. Applications will be accepted quarterly, but we'll keep other program requirements flexible and make a simple process for renewals.
Pilot Simple Process Annual Plan Grants, with more flexible support. We will pilot the simple process for Annual Plan Grants without a staffing limit, and make a process for requesting increases up to the funding limit throughout the year.
Focus on support. As we implement changes, we will focus on increasing support specifically for applicants. We will focus on improving the resources respondents have identified as high priority: connections to others, online resources in general, and budget guidelines. We plan to target support for grantees to specific needs and topics.
Priorities. Respondents overall prioritize achieving impact and simplicity and speed in the application process as even more important than community participation. Respondents' priorities also align well with the design principles outlined in the original idea of clear pathways, impact, and simplicity. (see Aspects of the grants process)
Support. Non-monetary support is important to respondents, and so is receiving the right kind of support. Connections, budget guidelines, and online resources are seen as most important, and the need for more targeted support is also emphasized. Respondents also described the need for better support during the application process and better tools and support for committees who are making decisions. Finally, collecting global metrics is a difficult part of the process that needs more support. (see Resources and information and Aspects of the grants process)
Endorsements and strengths. Overall, many respondents endorse the idea, and like the new funding types and the ways funding could be used. For example, participants already like the way travel support works, like the idea of a simple process for annual plan grants, and like the idea of a pipeline for experimenting and growing projects. Many respondents find the distinctions among grant types easy to understand and some think the process could reduce the time and work required. (see Strengths and concerns)
Concerns and suggestions. Despite many overall endorsements, most respondents express both big and small concerns about the new structure, or have suggestions about how the idea could be improved. For some respondents, distinctions among grant types are not yet clear enough (e.g. seed vs. growth, projects vs. events, microfunds, research). Some are concerned that volunteers might spend too much time learning about the new structure. Some worry that rigid application cycles may not give volunteers enough flexibility to pursue opportunities throughout the year or when limited planning is required, and that limits on staff or funding could make some grant types less effective. (see Strengths and concerns and Suggestions)
Based on the feedback summarized below, we'll move forward with a new structure for grants as follows:
Rapid Grants. To provide quick support for opportunities throughout the year. Up to $2000 for low-risk experiments and standard needs (meetups, etc) that don't need broad review to get started.
Project Grants. To promote experiments and sustain ideas that work. Up to $100,000 for 12 months. There will be different guidelines and support systems for experiments and established projects, but one application process.
Annual Plan Grants. To support organizations in developing and sustaining effective programs. Up to $100,000 for 12 months through a simple process, and full process for larger or unrestricted grants.
Conference and Travel Support. To support organizers and travelers attending conferences. Travel, kits and guidance, funds and merchandise, to foster community connections and learning.
We heard about a need to quickly start small low-risk experiments or get small amounts of money or merchandise for well-understood needs throughout the year. The ability to apply throughout the year gives applicants more options for pursuing opportunities as they arise or doing activities that require less advance planning. Respondents rate their Travel and Participation Support experience as above average or excellent, so we can learn what works about that fast review model to create a process for Rapid Grants.
Concerns mitigated: grants without quarterly application cycles are needed to take advantage of opportunities as they come up.
Strengths enhanced: space for experimentation is valued, the Travel and Participation Support model of review works well.
We increased the funding limit for these grants to $2000, and will set up a quick, staff-led review process with simple reporting. This will also spare committee time, so that volunteers can focus on higher-risk grants that require community expertise and discussion.
Priorities reassessed: speed and simplicity are more important than community review for some types of grants, committee expertise needs to be honed.
Rapid Grants can cover expenses for either small projects or small events, so we've decoupled them from events.
Concerns mitigated: distinctions between project and event grant types may be confusing.
Suggestions implemented: make the smallest grants easier to get by eliminating confusing distinctions.
As part of piloting this kind of grant, we will be sure to coordinate with local organizations that also give microgrants and learn from these models.
Suggestions implemented: emphasize local grants, look at local models for grants.
Rapid Grants and Project Grants will not have separate entry points, to make things simpler for applicants.
Suggestions implemented: reduce the time and work required of volunteers by simplifying entry points.
We heard from some that the seed and growth distinction is unclear, while others valued this approach to having a pipeline for how projects develop. We'd like to have the best of both, so while we are including guidelines for how to best steer projects from experimentation to scale, we're not going to have two different application processes.
Concerns mitigated: distinctions between seed and growth options may be confusing.
Strengths maintained: pipeline for growing projects from experiment to scale.
Many participants noted that grants that are considered projects can often include events. We will focus on distinguishing only Conference Support and Travel Support from Project Grants, and we certainly won't require a project that also has travel needs to apply for each part separately. Staff can connect these on the back end. Annual Plan Grantees will have the option to apply for Conference Support separately or include conferences in their plans.
Concerns mitigated: distinctions between project and event grant types may be confusing.
The funding limit for Project Grants will be $100K. Based on past grants, we'll set expectations that first-time grantees or first-time experiments should be funded under $30K.
Concerns mitigated: $30K funding limit may not make sense for such a wide range of projects, distinction between seed and growth options may be confusing.
There will be four application cycles each year, since having cycles for reviewing larger projects is beneficial for committee participation. The committee will be divided into working groups by topic and offered more focused support.
Strengths maintained: better supported process for community decision making.
Staff will sort applications by topic so that applicants get customized support and review. Possible topics include research, software development, and photo contests. This will also help us hone committee expertise, making sure committee members are focused on the grants they are most interested in and qualified to advise about.
Suggestions implemented: better applicant support, topical support for grantees.
Concerns mitigated: it's not clear where research fits.
We've heard concerns that a staffing limit for simple process Annual Plan Grants may prevent organizations from seeking the best funding type for them, or growing at the pace they need. So, we will implement the simple annual plans pilot without a limit on staff. We are keeping the $100K limit on funds.
Concerns mitigated: a limit on staff may prevent organizations from growing in a healthy way.
Suggestions implemented: focus should be on support rather than artificial limits.
Strengths maintained: we need to tie funding options to appropriate levels of risk.
There will be a way to add funds throughout the year to support new opportunities up to the funding limit, giving smaller organizations more flexibility. Also, both full process and simple process Annual Plan Grantees can apply for conference grants if they don't want to include those in their annual plans. For those that prefer a single application process, though, all of an organization's projects, conferences, and travel can be included in one annual plan.
Concerns mitigated: grantees with annual plans may need the flexibility to get funding for opportunities throughout the year.
Suggestions implemented: we can better track impact when an organization's activities are tied together as part of the same grant.
This won't be implemented right away, but we will be thinking about how to set up a simple annual renewal process for organizations that don't need to grow their budgets or staff, to increase stability and reduce administrative overhead for smaller organizations.
Strengths enhanced: a simple process for applicants.
We will start making changes to grants with a pilot for the Simple Annual Plan Grants that will be implemented immediately, once this consultation closes.
Concerns mitigated: respondents are currently the least satisfied with Annual Plan Grants overall, and find it the most difficult grants program.
Changes to the full process Annual Plan Grants will be considered after the first round of applications from the simple pilot, because we want to learn from the pilot before making any changes for 2016-2017 Round 1.
Planned focus: simplify processes, clarify links between funding options and type of support needed.
We understand that non-monetary support for organizations is a priority, and so we are working to develop more resources to support organizations of all shapes and sizes.
Priorities reassessed: emphasis on topical non-monetary support.
Travel and Participation Support (TPS) and Wikimania Scholarships are working well, and participants in these programs are generally satisfied. Under the new structure, we are not proposing any big changes to these programs or workflows.
Suggestions implemented: no major changes to programs that are already working well.
Conference support will be about more than just dollar amounts. For conference organizers, we will be offering standard kits, guidelines, and a more consistent application and reporting process. We want this process to streamline the way we do conferences in our movement, and we want to provide conference organizers with the resources they need to achieve more impact through their conferences.
Strengths enhanced: emphasis on topical non-monetary support, emphasis on impact.
We heard that some respondents did not find the distinction between projects and events clear enough, and so we are no longer using a broad "Events" grant type. We will focus only on offering more structured support for conference organizers and travel.
Concerns mitigated: distinctions between project and event grant types are not clear.
We know that it is important to minimize disruption to communities and grantees as we implement changes, so we'll start by piloting new approaches in some areas while maintaining existing systems in the other areas. Eventually, we will move all grants over to the new structure. Most changes won't be noticeable until later in 2016.
Timeline for changes to WMF grants
1 October 2015
Open applications for Simple Process Annual Plan Grants pilot (applications due 1 November for grants starting 1 January)*
Preliminary evaluation of Simple Process Annual Plan Grants based on first application phase
March - June 2016
Finalize changes to Full Process Annual Plan Grants based on simple process pilot and consultation feedback.
Implement changes to Full Process Annual Plan Grants for round 1 2016/2017 applicants
Transition Individual Engagement Grants + Project and Event Grants to Project Grants and Rapid Grants
Evaluate Simple Process Annual Plan Grants pilot with data from first round of grant reports.
* Applications can be made on a rolling basis throughout the 2016 pilot.
We gathered feedback from respondents through three channels:
IdeaLab. 34 people shared their thoughts on the Idea discussion page, and through endorsements on the idea page.
Survey. 198 people shared their thoughts in a survey that included multiple choice questions as well as open ended questions about the Idea.
Conversations. 13 people sent us thoughts by email or voice.
Having multiple channels for this consultation, including alternatives to a public discussion page, was very useful. The survey allowed us to gather qualitative feedback from a diverse range of contributors, including participants from emerging communities. The survey also allowed us to collect specific information about how respondents prioritize or experience different aspects of the grants process, and may allow us to compare responses over time. We also performed a historical analysis to understand how a new program structure would have affected grants awarded in WMF fiscal year 2013-14. This new structure was applied to all grants programs, except the Annual Plan Grants program (i.e. Project and Event Grants, Individual Engagement Grants, Travel and Participation Support, and Wikimania Scholarships). The final results from this historical analysis are summarized as part of the Non-APG Grants Impact Analysis for Fiscal Year 2013-14.
Learn more about responses, demographics, and analysis
Users who have been active in the grants namespaces in the past two years were invited to take the survey, so that we could benefit from their past experience. We contacted these users by messaging them on-wiki and by emailing current committees and grantees.
We also sent mass messages to different village pumps inviting feedback across projects and languages on the Idea discussion page.
Several survey questions ask respondents to provide rankings. In these cases, we tallied points based on the number of times each item was highly ranked, and determined rankings from those points. To simplify the results for readers, we present them as rankings on a scale from one to ten, with one being the most important.
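The report doesn't spell out the exact point scheme, so as a rough illustration only, a simple Borda-style tally (an assumption, not the survey's documented method) could be computed like this:

```python
from collections import defaultdict

def tally_rankings(ballots, num_items):
    """Score each item by summing points across ballots.

    Each ballot lists items from most to least important; higher-ranked
    items earn more points (a Borda-style count -- an assumed scheme,
    since the report does not specify its exact tallying rule).
    Returns a dict mapping each item to its final rank (1 = most important).
    """
    points = defaultdict(int)
    for ballot in ballots:
        for position, item in enumerate(ballot):
            points[item] += num_items - position  # rank 1 earns the most points
    # Order items by total points, highest first, then assign ranks
    ordered = sorted(points, key=points.get, reverse=True)
    return {item: rank for rank, item in enumerate(ordered, start=1)}

ballots = [
    ["impact", "simplicity", "participation"],
    ["simplicity", "impact", "participation"],
    ["impact", "participation", "simplicity"],
]
print(tally_rankings(ballots, num_items=3))  # "impact" ends up ranked 1
```

Here "impact" is ranked first because it is most often placed near the top of the hypothetical ballots, which mirrors how frequently-high-ranked items rise in the report's summaries.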
We took two approaches to the Likert-scale questions included in the survey. For questions we wanted to rank, we gave each item a score based on the mean, assigning each point on the five-point scale a numerical value from one to five, with five being the most positive. For other questions, we collapsed each five-point scale into three categories to show the percentage of respondents in each range (e.g. negative, neutral, positive).
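The two treatments above amount to a small amount of arithmetic. This sketch assumes the natural bucketing (1-2 negative, 3 neutral, 4-5 positive), which the report implies but does not state explicitly:

```python
def likert_mean(responses):
    """Mean score on a 1-5 scale, with 5 the most positive rating."""
    return sum(responses) / len(responses)

def collapse_to_three(responses):
    """Collapse a five-point scale into negative/neutral/positive shares.

    Bucketing assumed: 1-2 negative, 3 neutral, 4-5 positive
    (the report describes three ranges but not the exact cut points).
    """
    n = len(responses)
    return {
        "negative": sum(r <= 2 for r in responses) / n,
        "neutral": sum(r == 3 for r in responses) / n,
        "positive": sum(r >= 4 for r in responses) / n,
    }

scores = [5, 4, 4, 3, 2, 5, 1, 4]   # hypothetical five-point ratings
print(round(likert_mean(scores), 2))  # 3.5
print(collapse_to_three(scores))
```

The mean is used for rank-ordering items against each other, while the three-bucket shares are used when reporting a single question on its own.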
We qualitatively coded responses from all response channels (survey, IdeaLab, etc.) together in order to identify the most relevant types of information, as well as themes or topics that were frequently brought up.
We used codes to
group quotes about the most frequently identified themes, and then look at how many times each theme came up.
identify specific strengths, specific concerns, suggestions, and needs, and figure out how these different types of feedback line up with the themes.
identify overall endorsements and overall rejections in order to track how negative or positive general responses to the idea were.
In this report, we use some numbers to identify how often different themes and types of feedback were found. We do think the numbers are useful to help us understand the aggregated feedback at a high level, but we've used language like "about" to describe frequency since qualitative coding is not an exact science.
We had two people coding the responses to get more consistency in how the codes were applied, and to catch inaccuracies.
When looking at the discussion page feedback, we grouped comments that came from the same user as one response. We also included endorsements on the Idea page with these responses.
We tried our best to limit coding within each comment by not coding related or redundant items. If a similar or same point was brought up on the discussion page several times, it may have been coded only once.
We realize there are some duplicate responses in the survey and on the discussion page, since a few people participated using multiple methods; however, since the survey is anonymous we aren't able to identify which responses are likely to be redundant.
Not all 198 survey responses counted were complete, and not all included direct feedback about the idea.
Of the 245 total responses, we received 118 responses that included direct feedback about the idea.
All Wikimedia projects are represented by at least one respondent, though 51% of survey respondents identify Wikipedia as their primary project.
69% identify as male, while 21% identify as female, 2% identify as other, and 7% prefer not to say.
Survey respondents identify with 34 Wikimedia language projects.
Survey respondents are located in 101 countries, 61% in the global north and 39% in the global south. Of the 118 responses to the idea, 60% are from the global north, 20% are from the global south, 20% are not identified.
The largest share of survey responses (35%) identify most with the Project and Event Grants program, while 19% of respondents each identify with Annual Plan Grants, Wikimania Scholarships, and Individual Engagement Grants. 7% of respondents identify with Travel and Participation Support.
Overall, the idea received many more overall endorsements than overall rejections. We found about 40 general endorsements and 6 general rejections, but many people who endorsed the idea overall also had concerns about the idea and suggestions to offer.
Respondents identify about an equal number of specific strengths and concerns about the idea, and often disagree on key issues. We identified about 142 specific strengths, and about 139 specific concerns.
Respondents offer more than 100 suggestions about how to improve grants or improve the idea.
It is important to us to look at what specific strengths and concerns about the idea were identified. We identified some of the most frequently mentioned strengths and concerns, which are summarized in the table below.
The distinctions between grant types and options make things more clear. We found this strength about 41 times.
In general I find that the three groups of grants have names which correspond to the kind of supports the community/affiliates are seeking.
The distinctions between grant types and options are not clear. We found this concern identified about 30 times.
Branding in this proposal is "three types of grants", but the kinds of projects including in those three categories do not naturally go together, and the kinds of funding offers in each of these categories are not intuitive.
The process will be simpler. This will make things easier for applicants and grantees. We found this strength about 39 times.
The new process separates out the more complex grants seeking restricted annual grants from the less complicated project and event type grants.
Too much time and work will be required from volunteers to learn this idea. We found this concern identified about 31 times.
While I like the grant types and differentiation, I think that it is an inefficient use of community volunteer time to learn about these different grants.
We like the types of funding offered and the ways funding can be used. We found this strength about 29 times.
Now has an option for small level grants for low-cost events and projects, rather than having to go through the full process for very small amounts of money.
We are not satisfied with the types of funding offered and the ways funding can be used. We found this concern identified about 20 times.
The new structure doesn't address, or worse, might make things tough for an organization in transition [such as from] 1 FTE to more employees.
We also identified some strengths and concerns that were less frequent, but still came up quite often. Concerns include complexity, inflexibility, and worries that the idea would not appropriately address risk. Other strengths include the idea's emphasis on impact, and its potential to save volunteers time and work.
The most frequently offered suggestions focus on improving applicant support, with about 13 suggestions in this area.
Imagine a clinic with three doctors inside. You enter reception and they direct you to the door of the doctor you need to see. If each doctor has his own door directly accessible from outside some people will knock at the wrong door. Thus the challenge will not only be a kind welcome, but also a helpful redirection which is not mentioned.
We tracked what respondents are saying about what they need, both to make grants better and to do their work on the Wikimedia projects. This is an important area to track as we move forward with implementing changes. We read about 21 comments about specific needs.
Summary of needs described during the consultation
A number of respondents identify alternative structures to the idea originally proposed. We received about 14 suggestions for alternative structures, which are summarized here.
Ideas for alternative grant structures
Instead of proposing all of these types of grants, have one application space that says "propose anything for any reason". (...) After someone writes a proposal, paid staff should apply the esoteric internal labels which this proposal is designing. The person submitting the proposal does not even need to know this is happening.
55% rank the grants experience as above average or excellent.
Travel and Participation Support receives the highest satisfaction ratings overall, with about 63% of respondents describing their overall experience as above average or excellent.
People with experience with Annual Plan Grants are least satisfied, with only about 38% describing their experience as above average or excellent.
51% of respondents find the grants process to be easy, while 23% find it difficult.
Easiest aspect of the process: Doing the paperwork to get funds once the grant is approved.
Most difficult aspect of the process: Making an application and collecting global metrics for a grant report.
Achieving impact, and speed and simplicity in the application process, are ranked as the highest priorities. These priorities are ranked more highly than community participation.
Connections to others, financial guidelines, and online program resources are ranked as the most important forms of non-monetary support, and satisfaction ratings indicate there's room to improve in meeting these needs.
Across all programs, 55% of respondents describe their experience as above average or excellent, while 21% describe their experience as below average or very poor. It is important to note that survey respondents included people whose grant applications did not receive funding.
Travel and Participation Support is working well. In line with our qualitative findings about the idea, survey respondents rank their satisfaction with the Travel and Participation Support program as above average (31.5%) or excellent (31.5%). Travel and Participation Support received the highest satisfaction ratings overall, with about 63% of respondents describing their overall experience as above average or excellent.
People with experience with Annual Plan Grants are less satisfied, with only about 38% describing their experience as above average or excellent. 36% described their experience as average, and 26% described their experience as below average or very poor.
Respondents are able to get the information they need about grants, but need more timely and useful feedback about grant proposals and reports.
Overall, more respondents find the grants process easy than difficult, with about 51% of respondents finding the process easy overall and about 23% finding it difficult overall. Respondents find the administrative side of grants particularly easy, and share positive comments about their interactions with WMF staff around administering the grant and requesting changes.
Once I actually got the grant, everything was quite straightforward in administering it,
I had the impression I was entering a well-oiled machine.
To tell the truth there is nothing difficult in PEGs or APGs if you know your programs and know what you want and what you want to achieve. Even the language barrier can't be an obstacle in this case. If your program is well thought out you will always get fund for it. If something misses in your PEG application or there is another problem the WMF staff or other volunteers who participate in the discussions always ask to provide the missing information or give an advise how to correct it and at the end your PEG is again approved. I want to state again that everything depends on the program, why it is needed, what impact it will bring to the movement, etc.
Respondents find some aspects of the application and reporting process difficult. When asked what could be improved about the grants processes, several respondents also brought up issues around communication with WMF or the grants committees. Respondents are concerned about the timeliness and quality of feedback, and feel that getting timely feedback is important to being more successful with grants.
I think the most difficult in the grant program is the feedback process. There are several expectations that a requester may not be able to provide as they have different backgrounds and the Wikimedia movement is not as extensive as they are in other places. There has to be an understanding of the movement around the world. Everyone seem to be grouped together with the same expectations and same output.
Grant reviewers have gone through dozens of proposals and thus have a very sharp opinion on the usage of funds and budgeting. In a way, the process resembles internal budgeting within a company more than an open grant program created to fund interesting ideas. This makes it very difficult to succeed in the grant process without prior experience with the expert opinions of the WMF staff. Funds are therefore only available to a narrow group of applicants who know "what to do and how" - a group I have gradually become a member of. Finding a solution to this is very hard.
Ease and difficulty of specific aspects of the grants process
This bar chart highlights how easy or difficult respondents find specific aspects of the grants process.
Easiest aspects: (1) Receiving funds, (2) Grant agreement, (3) Eligibility
Most difficult aspects: (1) Collecting global metrics, (2) Applying, (3) Reporting
Respondents prioritize (1) Simplicity in the application process, (2) Knowing if you are able to apply (eligible) for a grant, (3) Preparing your application. In the chart below, you can see the top three priorities highlighted in orange. You can see that community participation, partnering with local groups, and leadership development, were not emphasized by respondents overall.
Participants ranked (1) Connections to others, (2) Budget or financial guidelines, and (3) Online program resources as the most important non-monetary resources, while they were most satisfied with (1) Specific feedback or coaching, (2) Mentorship from WMF staff, and (3) Logistical support for projects. The chart below highlights the top three resources ranked as most important in orange, while the top three resources that respondents are most satisfied with are shown in green. These two groups do not overlap, which indicates that the resources respondents rank as most important are not the ones they are most satisfied with.
Participants want to get information through Pages in the grants namespace, Emails from staff, and Public mailing lists. Participants highlight in the qualitative feedback that high priority information is particularly useful in the form of Emails from staff, while other methods (e.g. private mailing lists) are less effective. This chart shows how different ways of sharing information are ranked by survey participants.
100% of respondents who participated in the Annual Plan Grants program find general support at least somewhat important. Most find general support funding very important: 20% find it somewhat important, while 80% find it very important or extremely important. In general, respondents report that receiving general support funding has allowed them to better strategize over a longer period, respond flexibly to opportunities and needs, and improve the effectiveness of their programs as they learn. Another benefit is more flexibility to build organizational capacity through grants.
It has been very important for us in the past for our long term capacity building and differentiation of our programs and activities.
The biggest issue for us is course correction (changing programs as we go, depending on the circumstances that often cannot be anticipated over a year in advance) and the fact that we can do everything in one batch, instead of per-project.