Appendix III: Evaluation Worksheet
The below worksheet is used by both OTF staff and our Advisory Council to evaluate proposals.
- Does the proposal have a clear and well-justified problem statement?
- Does the proposal identify a specific group of people who are affected by that problem, and do they fall under OTF's congressional remit?
- Does the proposal cite a compelling case study or user problem?
- Does the proposal describe an approach that is realistic, achievable, and responsive to the problem statement?
- Does the proposal describe how this approach was chosen and why it will succeed?
- Does the proposal identify and acknowledge what the challenges will be?
- Are there liabilities and risks to OTF from taking on this project?
- Does the proposal describe what is currently being done in this field and the known limitations?
- Does the project show familiarity with the field and other people working in this space?
- Does the project build upon or complement other projects?
- Is the project duplicative of other projects? If so, does the proposal provide a strong justification for that?
- Does the proposal break down the approach outlined above into a set of aligned objectives?
- Are the objectives provided specific, measurable and achievable?
- Are the objectives broken down into activities and deliverables with sufficient detail?
- Does the proposal clearly state what deliverables are expected at each stage of the project?
- Does the proposal include a clear distribution and promotion plan?
- Is the budget realistic and commensurate with both the project objectives and time frame?
- Does the project propose a payment schedule it has the capacity to implement within a payment-on-delivery framework, i.e., no funds up front?
- Does the proposal identify any cost sharing or matching support for the proposed effort?
- Does the project currently receive any U.S. government or other public funding?
Project Monitoring and Evaluation
- Does the project articulate a clear set of evaluation criteria and milestone metrics for activities, objectives, and deliverables?
- Will the approach yield quantitative and/or qualitative results?
- How difficult will an assessment of success or failure be? Does the proposing entity have the capacity to self-evaluate and extract "lessons learned"?
- Is the proposed effort able to be openly peer reviewed and/or include a peer review process?
- Do you have any feedback for the project?
- Do you think the project needs support in coming up with a monitoring and evaluation plan?
- Does the project currently have a diversified funding/support strategy?
- Does the proposed effort have access to other sources of direct or indirect support, such as community or other in-kind support that it already receives?
- Does the project seek to share resources or enable others to reuse the resources they develop?
- Do the objectives of this proposal contribute broadly to other internet freedom projects and the larger internet freedom community?
- Is the project located in a sustainable host institution?
- Does the project have community support?
Institution's Record and Capacity
- Is the project team qualified to complete the proposed scope of work?
- Does the team have a history of successful work relevant to the proposed effort?
- Have team members worked with at-risk communities in the past?
- Does the proposing entity have a sufficient core team (leadership, developers, etc.) dedicated to this project?
- Are project team members clearly identified in the proposal, along with their work experience, including at least one person who will explicitly take on collaboration efforts?
This section asks you to score and comment on specific proposal topics.
- Did the applicant explain the technical objectives of this effort clearly?
- Did the applicant provide any research/documentation of the feasibility of their technical approach?
- Did the applicant explain any hurdles or technical challenges they might encounter in implementation and how they will address them?
- Does the project identify potential unintended consequences?
- Does it identify how an adversary might use the solution to further their own goals?
- Are the chosen tactics easily overcome by the adversary?
- Does the proposal discuss how the project could be undermined and identify its own deficiencies and limitations, or does it presume there are none?
- Does the proposal explore short-, medium-, and long-term strategies from the adversary's point of view?
- Does the project increase or decrease known attack surfaces?
- Does the proposal consider potential illicit uses of the project?
- Does the proposal demonstrate clear need/demand for the proposed effort from the target audiences?
- Does the applicant consider usability and/or accessibility issues and challenges?
- Does the project have prior usability experience? If it is an established effort, does it have a clear usability practice?
- Does the applicant's research methodology seem coherent and well considered?
- Does it seem appropriate to the research questions articulated in the proposal?
- How will the applicant ensure that audiences are made aware of the research findings?
- To the degree that the research involves human participants or subjects, how does the research plan include them and account for their well-being?
- What steps does the applicant propose to mitigate any potential risks or harms that may result to participants or subjects?
- Does the applicant elaborate on the risks for the implementers and the participants in digital security activities and how they plan to mitigate them?
- Does the applicant have a clear understanding of the risks associated with the organization applying new security procedures and adopting new tools?
- Does the applicant provide detailed information on the proposed training (participants, number of training days, topics to be covered, etc.)?
- Do they factor in technical limitations (bandwidth, usability) and legal restrictions when designing interventions?
- Did the applicant provide a copy of the training schedule?
- Do the proposed activities, including the training, give tool developers or digital security researchers feedback, and is there a clear plan for how this work would ultimately advance those tools and research?
- Does the proposal articulate efforts to include underrepresented voices in the community?
- Does the proposal identify a clear plan to cultivate community support around the proposed effort, including mechanisms to receive feedback and get others involved?
- Does the proposal discuss how the project could be undermined, and does it identify potential risks for those involved?
- Does the proposal provide a security strategy to mitigate associated risks?
- Does the proposal include a code of conduct?
(space for recommendations)
Please list any feedback you have on the proposed effort, or any concerning elements the applicant needs to address.
(space for feedback and concerning elements)