Grants:APG/FDC portal/Feedback and continuous improvement of the FDC process/Process Survey/2013-14 Round 1


Overview

This page summarizes survey results from participants in the FDC process, 2013-14 Round 1. It covers both the Process Survey (N=19) and the Cost-Benefit Survey (N=6).

Process survey results from Round 1 2013-14 Annual Plan Grants.

Context of 2013-14 Round 1

  • This round had 11 applicants, 3 of whom were participating in the FDC process for the first time.
  • This round had 2 more compliant applications than 2012-13 Round 1.
  • The FDC awarded funds to all 11 entities, with amounts ranging from $53,000 to $1,750,000.
  • 88% of entities received 50% or more of requested funding.
  • On average, entities received 75% of requested funds, with a range of 30% to 94% and median of 80%.
  • Entities applying for their second year of funding were granted an average 17% budget increase.

Executive Summary

  • 19 people responded to the process survey: 11 entity representatives, 2 FDC members, 3 FDC or WMF staff, 1 WMF Board member, and 2 FDC Advisory Committee members.
  • Entities are spending less time on the application process but are less satisfied with it overall. FDC members and WMF staff report spending more time on the FDC process than in the past and are more satisfied with it.
  • Most returning entities seemed to have missed or disregarded information about guardrails on funding increases.
  • Entities are struggling less with the portal/process than in the past, but finding more challenges with developing SMART goals and measuring impact.
  • The FDC process is perceived by many entities to be more useful for institutionalization, and less effective for innovation.
  • Communication was a major theme:
    • Several entities felt their context was not understood by the FDC.
    • There is a desire for more communication with staff in terms of capacity building and managing expectations.
    • While community comments are generally well received/desirable during the proposal review period, they can be problematic when coming from those unfamiliar with the FDC process.

Recommendations

  • Conduct pre-application capacity building for applicants, with a focus on aligning SMART goals with innovation, program development and budget planning.[1]
  • Give entities an opportunity to discuss their application directly with the FDC, either in community comments or by video call during deliberations.
  • Enable greater community engagement by simplifying language in the application so that it can be translated into local languages.
  • Research where innovation is happening within FDC-funded groups, and how or whether it is working.

Cost-benefit survey results

We heard from 6 of the 11 entities (55%). For full results, see full survey results.

Interesting points:

  • Time: Average (self-reported) time per applicant = 92 hours
  • $ Cost: One applicant spent $950 on outside help with the application process; the rest spent $0 on outside help.
  • Pain Points: There could be too much repetition in the process (Q&A repeats questions answered elsewhere)
  • Staff Assessments: 5 of 6 respondents found it helpful to receive the staff assessment beforehand. (Note: this was a process change in this round; instead of immediately posting assessments publicly, WMF FDC staff sent the draft assessments to entities the day before the public posting.)
  • Q&A: Many respondents wanted more time for the Q&A before the staff assessment is published.

Notes

  1. This was piloted in Round 2 2013-14, with a hangout dedicated to developing SMART goals prior to application deadlines. See recording and slides.