News

February 17th, 2017

Tracking, summarizing and analyzing evidence: A database tool for documenting the evaluation process.

In Ottawa, on February 16, at the 2017 Annual Learning Event of the National Capital Chapter of the Canadian Evaluation Society, Victoria E. Díaz, Pierre Mercier, and Celine Pinsent, partners at DPM Research, presented a database tool they developed to track, summarize and analyze evidence across evaluation projects.

Program evaluations involve the compilation and analysis of varied data from multiple lines of evidence. To track, summarize and analyze large amounts of data from multiple data sources, we designed, developed and implemented an Evaluation and Analysis Database that assists in tracking data according to line of evidence, summarizing the data and capturing the analyses produced throughout the evaluation process. The database was tailored to each specific evaluation's matrix (indicators, questions, data sources). It provides an important tool for compiling data and evidence in a practical structure that supports systematic data collection, as well as a robust weighted-evidence approach for integrating all lines of evidence. In addition, once the evaluation has been completed, the database provides easy and fast access to all documentation supporting the evaluation process.
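For illustration only, the sketch below shows one way such a structure could be laid out as a small relational database, with tables for evaluation questions, indicators, lines of evidence and summarized evidence items. The platform, table names and fields here are assumptions made for the example, not a description of the actual tool presented.

import sqlite3

# Hypothetical, simplified schema organizing evidence around an evaluation
# matrix (questions, indicators, data sources) and lines of evidence.
conn = sqlite3.connect("evaluation_analysis.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS evaluation_question (
    question_id   INTEGER PRIMARY KEY,
    question_text TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS indicator (
    indicator_id   INTEGER PRIMARY KEY,
    question_id    INTEGER NOT NULL REFERENCES evaluation_question(question_id),
    indicator_text TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS line_of_evidence (
    line_id INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,        -- e.g., document review, interviews, survey
    weight  REAL DEFAULT 1.0      -- relative weight in the evidence synthesis
);
CREATE TABLE IF NOT EXISTS evidence_item (
    evidence_id  INTEGER PRIMARY KEY,
    indicator_id INTEGER NOT NULL REFERENCES indicator(indicator_id),
    line_id      INTEGER NOT NULL REFERENCES line_of_evidence(line_id),
    summary      TEXT NOT NULL,   -- summarized evidence entered by the analyst
    source_ref   TEXT             -- pointer back to the underlying document or data
);
""")
conn.commit()
conn.close()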

In this presentation, we discuss the database's main features, including its different components, such as navigation interfaces, data entry forms and predefined reports. The design process is also presented, identifying specific challenges encountered during conceptualization and their solutions. In addition, we demonstrate how the database has been used during various recent evaluations, including the process by which data and evidence are entered, how evidence is analyzed and findings are developed, and the capacity to revisit the analysis process for review and validation. The contribution of the database approach to triangulating and weighting findings from different lines of evidence to address evaluation questions is highlighted. As well, we indicate how the database can be adapted as a flexible tool, tailored to various evaluations according to scope, evaluation matrix and lines of evidence employed.
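As a purely illustrative example of weighted synthesis, the short sketch below combines ratings from several lines of evidence into a single score for an evaluation question. The rating scale, line names and weights are assumptions made for this example; they are not the presenters' actual scoring scheme.

from typing import Dict

def weighted_finding(ratings: Dict[str, float], weights: Dict[str, float]) -> float:
    """Weighted average rating across lines of evidence (-1 contradicts, 0 neutral, 1 supports)."""
    total_weight = sum(weights[line] for line in ratings)
    return sum(ratings[line] * weights[line] for line in ratings) / total_weight

# Example: three hypothetical lines of evidence with different credibility weights.
ratings = {"document review": 1.0, "key informant interviews": 0.5, "survey": 1.0}
weights = {"document review": 1.0, "key informant interviews": 2.0, "survey": 1.5}
print(weighted_finding(ratings, weights))  # a positive value indicates overall support for the finding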