Appendix A: Research Methodology

Tech Impact Idealware has developed a rigorous methodology governing the research for all of the reports in our Consumers Guide series. The following is an overview of the process.

The research for this report followed a six-step process:

Identify Inclusion Criteria

We began by reaching out to Subject Matter Experts (SMEs) who assist nonprofits with the selection of donor management software and soliciting their input on the criteria used to select systems for inclusion in this report. Based on their feedback, we developed the following criteria that a system must meet to be included in this edition of the Consumers Guide:

  • The system offers a cloud-based subscription option.
  • The system is intended for use by small organizations as their only database to manage online and offline fundraising activities, donors, and supporters.
  • An organization with three users and 1,000 constituent records could purchase it for less than $10,000 in the first year, including yearly subscription costs, implementation costs, and data migration costs.
  • More than 100 nonprofit organizations based in North America are current, active users of the system.
  • The system allows nonprofits to:
    • Easily view and update contact information and all interactions on a constituent record.
    • Create a variety of online forms (donation, event, questionnaire, etc.).
    • Process online payments via a native payment gateway or a pre-existing integration with a third-party payment gateway.
    • Create and collect data from email marketing campaigns, either via a native tool or through a pre-existing integration with a third-party tool.
    • Manage and report on both online and direct mail fundraising campaigns.
    • Track fundraising metrics on a dashboard.
    • Export transaction data in a format compatible with accounting software.

Define Vendors to Be Included

Based on a preliminary scan of the marketplace, we developed a list of 47 systems that were potential fits for the report. We invited these vendors to fill out a preliminary survey about their systems, pricing, number of clients, and key features to be considered for inclusion. We emailed the invitation directly to individual contacts at all vendors included in the last edition of the report, vendors who had contacted Tech Impact since that edition to let us know they’d like to be included, and vendors already known to Tech Impact staff or the SMEs listed in Appendix C: Authors and Contributors. We reached out to vendors for whom we did not have an email contact via general inquiry forms or email addresses listed on their websites, messages to the software’s Facebook page administrators, or messages to sales or marketing staff via LinkedIn. We sent multiple reminder emails or messages to follow up with vendors who did not respond to our initial inquiries.

In all, 33 vendors responded to our inclusion survey. From those, we identified 23 systems that met our criteria.

NOTE: This list of vendors was created completely independently of the process of soliciting any vendor for funding. We maintain a “firewall”: our fundraising staff works without any visibility into the research staff’s work, and vice versa, and the two teams do not coordinate.

Update Evaluation Criteria

In April and May 2020, Tech Impact Idealware solicited feedback from SMEs, nonprofit fundraising staff, and representatives of vendors selected as “best value systems” in our 2017 report, asking for input on useful changes and additions to the criteria we used to review systems in that report. We also reviewed reader feedback on how the information in the report was used, and held internal conversations about the report’s structure and how to better convey how we intend the report to be used. Based on this input and those conversations, we restructured the evaluation criteria and added new considerations to take into account in the system reviews.

Complete Summary Reviews

In June and July 2020, Tech Impact Idealware conducted half-hour demos of all 23 systems identified through the preliminary survey for inclusion in this report. Each vendor was sent a list of high-level tasks to demonstrate, designed to investigate the factors most often identified as critical to a nonprofit’s fundraising program. Based on these summary reviews, we sorted the systems into 12 use case categories and wrote a paragraph summarizing each system’s features.

Each summary paragraph was sent to the system’s vendor (or official representative) so they could flag errors, and was revised to ensure there were no inaccuracies. Vendors did not have final approval over their own review, but we gave them the option of having their review withheld from publication entirely. None of the vendors chose this option.

Identify the Use Case Representatives

We selected one system from each of our 12 use cases to review in more detail. Representative systems were chosen based on the strength of their functionality related to the use case, overall strength as a fundraising system, ease of use, and position in the marketplace.

Complete Extended Reviews

For each of those 12 systems, we conducted a 90-minute demo, during which we reviewed the system against the functionality listed in the evaluation rubric. We sent the resulting review text to the vendors so they could flag errors, and revised the reviews to ensure there were no inaccuracies. Vendors did not have final approval over their own reviews.

The new evaluation rubric (detailed in the next appendix) looks at 41 core functions, divided into 10 categories. Our extended profiles detail the systems’ key strengths and weaknesses in each of the 10 categories. Each function has specific criteria sorted into three levels: Standard, Enhanced, and Advanced. For each function and level, systems receive one of the following ratings: Does Not Meet, Partially Meets, or Meets.