Powered by the Association of Science and Technology Centers (ASTC)


Organizers of a Dialogue & Deliberation event should build program evaluation into their planning from the outset and carry it through to completion. They should be able to reflect on and evaluate the program’s success at every stage: early planning and preliminary activities, project governance and program development, the Dialogue & Deliberation program itself, and, finally, reporting, impacts, and outreach.

Co-designing the evaluation plan with the community partner is essential to ensure that the measurement strategies and outcomes assessed align with the partner’s values and are relevant to the community itself. For example, organizers might adopt a Team-Based Inquiry process at the early stages of event planning, which can aid significantly in reflecting on where goals were achieved, where successes occurred, and where there is room for improvement (see guide and training videos).

Synthesizing and communicating findings can inform the planning process and improve Dialogue & Deliberation programming for future iterations. Sharing evaluation findings can also prompt strategic thinking about how to increase community participation, improve facilitation techniques for engagement, make better use of subject matter experts and framing materials, and think more deeply about creating accessible and inclusive spaces for such events. Finally, discussing the outcomes of evaluation can build trust and strengthen community partnerships by demonstrating accountability and a willingness to reflect critically and improve.

Key Resources

These resources offer additional guidance and tools to support event organizers in developing and implementing evaluation programs to understand impacts and outcomes relevant to their community partners.

“Commit to Ongoing Learning and Improvement” is one of eight principles in this practitioner’s guide to pursuing ethical, equitable, and effective public engagement to inform decision-making processes. View resource.

This resource provides a post-event survey template, training videos on survey collection, and a sample script for recruiting survey participants. View resource.

This online rating tool lets the public provide input on different kinds of meetings, processes, town halls, festivals, and online activities. View resource.

This resource illustrates the professional and public impacts of nanotechnology-related forums hosted by museums and universities by summarizing evaluation data from the Nano & Society effort. View resource.

Evaluation is a key phase in Everyday Democracy’s Dialogue to Change model, and a subset of their resource library houses several evaluation planning tools and guides. View resource.

This comprehensive toolkit for designing public participation and engagement processes to improve democratic decision-making outlines key considerations in designing an evaluation plan. See “How do I evaluate a participatory process?” View resource.

This guide is aimed at helping staff at informal science education organizations develop, implement, and evaluate activities incorporating public dialogue and mutual learning strategies. See Chapter 5, “Evaluating Public Engagement Outcomes.” View resource.

This repository compiles useful tools to connect research and innovation with society through many threads, including public engagement, governance, and more. Search “evaluation” to find related tools and inspiring practices. View resource.

Sciencewise’s Guidance on Evaluating Projects offers a framework for assessing the quality of public dialogue, particularly relevant for events that involve government bodies as stakeholders. View resource.