PCST Network

Public Communication of Science and Technology


Better ways to determine impact of science outreach to help define and guide best practice

Nancy Longnecker   University of Western Australia, Australia

Jo Elliott   University of Western Australia, Australia

Mzamose Gondwe   University of Western Australia, Australia

Vibrant science engagement activities are conducted with diverse audiences across all Australian states and territories by enthusiastic and committed providers, many with support from a federal project, Inspiring Australia. The aims of the Inspiring Australia program span a range of objectives, such as raising awareness or understanding of a particular topic or encouraging young people to pursue science studies or careers. Because events are unique, communicating diverse topics to different audiences in a variety of ways, evaluation needs also vary, making a ‘one-size-fits-all’ approach problematic. In this paper, we report the development of evaluation resources designed to enable cross-program evaluation of impacts. The resources were also intended to help a broad spectrum of event organisers in different Australian states and territories develop effective and efficient evaluation strategies. Development of the evaluation resources for this project was influenced by the Theory of Planned Behaviour. We drew on resources such as the Framework for Evaluating Impacts of Informal Science Education Projects supported by the USA’s National Science Foundation and the Inspiring Learning Framework developed by the Museums, Libraries and Archives Council in the United Kingdom. Common questions in the survey templates were chosen to cover all five of the Generic Learning Outcomes defined in the Inspiring Learning Framework: Enjoyment, inspiration, creativity; Attitudes & values; Skills; Knowledge & understanding; and Activity, behaviour & progression.
Evaluation resources developed in this project included generic templates for online and paper-based surveys appropriate for a variety of events and target audiences. These surveys represent a middle ground, aimed at maximising feedback and comparability across a range of events. The resources also include suggestions for alternative methods: more time-consuming evaluation for those who want greater depth, and simpler evaluation that aims for a higher response rate at the expense of depth. We will present preliminary results from evaluations using these different tools. We will discuss the key measurables we have used to characterise the effectiveness of events against specific objectives, and we will facilitate a discussion, inviting feedback on additional key measurables that are being developed.

A copy of the full paper has not yet been submitted.
