|9am–12:30pm and 1:30–5pm WORKSHOP PROGRAM (all workshops full day)|
Collaborative Outcomes Reporting (COR): a participatory impact evaluation approach – Dr Jess Dart; Zazie Tolmer
Master class: evaluation frameworks in complex situations – Dorothy Lucks
Developing monitoring and evaluation frameworks – Anne Markiewicz
Interacting with stakeholders in the evaluation landscape: strategic methods and tools to make a difference – Zita Unger; Anthea Rutter
Collaborative Outcomes Reporting (COR): a participatory impact evaluation approach
presented by Dr Jess Dart and Zazie Tolmer, Clear Horizon Consulting
FULL DAY – BEGINNER/INTERMEDIATE
Collaborative Outcomes Reporting (COR) is a participatory approach to impact evaluation built around a performance story: a report presenting evidence of how a program has contributed to outcomes and impacts, which is then reviewed by both technical experts and program stakeholders.
Developed by Jess Dart, COR combines contribution analysis with Multiple Lines and Levels of Evidence (MLLE), mapping existing and additional data against the program logic to produce a performance story. A performance story report is essentially a short account of how a program contributed to outcomes. Although such reports vary in content and format, most are short, mention program context and aims, relate to a plausible results chain, and are backed by empirical evidence (Dart and Mayne, 2005). The aim is to tell the ‘story’ of a program’s performance using multiple lines of evidence.
COR is characterised by the inclusion of processes of review by experts and stakeholders, sometimes including community members, to check for the credibility of the evidence about what impacts have occurred and the extent to which these can be credibly attributed to the intervention. It can be used with complex and multi-partner programs across all sectors.
The objective of this workshop is to understand the major steps involved in COR and why they are important. It is pitched at the beginner to intermediate level. After an overview of the COR technique, the workshop will follow the process steps used in COR with real case studies. Participants will engage in an experiential process of analysing qualitative and quantitative data and video footage, working through the steps involved. The training will include a mix of presentations and small group work. All participants will receive a comprehensive set of workshop notes.
About the presenter
Jess Dart specialises in the evaluation and design of programs with complex, intangible outcomes. Jess's deep expertise in program logic and theory of change spans a wide range of contexts, including health, education, gender equity, environment, community development, agriculture, law and justice, disaster risk reduction, climate change and many other sectors. She has led program logic processes for local governments, State governments, NGOs, regional organisations, philanthropic organisations and private sector companies across 15 different countries. She is also an in-demand facilitator.
Zazie Tolmer is qualified in evaluation and international development and has facilitated over a hundred program logic workshops. She has worked in a wide variety of sectors in Australia and internationally. In recent years Zazie has worked as a design facilitator, using program logic and theory of change processes to develop complex social change programs internationally and in Australia.
Master class: evaluation frameworks in complex situations
presented by Dorothy Lucks, SDF Global
FULL DAY – INTERMEDIATE/COMMISSIONERS OF EVALUATION
The demand for evaluation is growing in the public, private and community sectors, often with a combination of stakeholders. Evaluation now covers many sectors, and increasingly the intersections between them (e.g. public mobility and infrastructure; employment and mental health; climate change and urban development). Evaluations are stepping beyond project and program evaluations into policy evaluation and meta-thematic and global evaluations. This means that commissioners of evaluations, evaluators and technical specialists within evaluation teams are increasingly faced with complex situations in which they are asked to apply evaluation principles and approaches. The evaluator's toolbox is growing, but the evaluation framework is the critical “box” that determines what tools are required for a specific evaluand.
The purpose of this master class is to strengthen understanding and deepen the capability of those interested in developing and using evaluation frameworks in complex situations. Participants are encouraged to bring real examples from their own experience and to share their positive and negative experiences and challenges with other participants.
The target group for this master class is intermediate level evaluators or commissioners of complex evaluations.
About the presenter
Dr Dorothy Lucks is the Executive Director of SDF Global (Sustainable Development Facilitation). She has extensive experience in complex sustainable development processes and evaluations that require engagement with multiple stakeholders. She has facilitated many professional development workshops in evaluation, most recently at the EvalPartners Global Forum in Nepal in relation to evaluation of the Sustainable Development Goals.
Developing monitoring and evaluation frameworks
presented by Anne Markiewicz, Anne Markiewicz and Associates
FULL DAY – BEGINNER/INTERMEDIATE
The development and implementation of monitoring and evaluation frameworks at the organisational, initiative and program level are important processes for indicating the results achieved and for supporting organisational learning. A monitoring and evaluation framework defines the parameters of the routine monitoring and periodic evaluation that will take place over the life of a program or initiative. The workshop provides participants with practical, step-by-step guidance for developing a monitoring and evaluation framework, supported by relevant background and theory. It presents a clear, staged conceptual model, discusses design and implementation issues and considers barriers or impediments, with strategies for addressing them. Participants will learn the format and approach for developing a monitoring and evaluation framework and the range of techniques and skills involved in its design and implementation, and will develop an appreciation of the scope of the tasks involved and how to approach them.
Participants will learn:
- the value and purpose of investing in and developing monitoring and evaluation frameworks
- the participatory approach and processes involved in developing such frameworks
- the steps and stages involved and the suggested 'Table of Contents' for constructing a monitoring and evaluation framework
The trainer will alternate between PowerPoint presentation and small-group interactive work. The workshop follows a case-study approach and involves participants in developing a monitoring and evaluation framework for the case study. In this way, the training is participatory and hands-on while still conveying sufficient theory and context.
About the presenter:
Anne Markiewicz has more than 20 years of evaluation experience, with a specialisation over the past decade in developing monitoring and evaluation frameworks. Anne is a past member of the AES Board. She has received two awards for excellence in evaluation from the Australasian Evaluation Society: the 'Indigenous Evaluation Award' (2008) and the 'Outstanding Contribution to Evaluation Award' (2013). She was made a Fellow of the Australasian Evaluation Society in 2015, in recognition of her contribution to evaluation and to the Society. Anne has authored three articles on evaluation published in the AES Journal, and has recently co-authored a textbook, Developing Monitoring and Evaluation Frameworks, published by SAGE. Anne has presented this workshop regularly at AES and AEA conferences and in Papua New Guinea.
Interacting with stakeholders in the evaluation landscape: strategic methods and tools to make a difference
presented by Zita Unger, Ziman and Anthea Rutter, The Centre for Program Evaluation, University of Melbourne
FULL DAY – BEGINNER/INTERMEDIATE
This workshop will help participants appreciate how evaluation can be strategic in their organisation, and how this differs from traditional approaches that fail to ask strategic questions. Participants are introduced to a variety of tools that support a strategic evaluation approach and are familiarised with the facilitator’s model, Strategic and Tactical Evaluation Management (STEM).
The workshop will cover: what strategic questions are and how they make a difference; how evaluation can be strategic in the organisation; an overview of developing indicators and credible measures for return on investment, including social return; skills development in the STEM Stakeholder Mapping and Stakeholder Information Needs approach, which supports strategic evaluation; practical and case study examples; and implications for social media.
The workshop aims to:
- provide a conceptual framework for strategic evaluation
- encourage participants to consider the benefits of strategic evaluation in their workplace
- raise awareness of barriers and enablers for implementation.
The workshop is highly interactive, providing opportunities for group discussion and exercises based on practical tools and scenarios. Take-away reference materials are included. A hashtag and Twitter handle will be provided for the workshop.
The workshop is directed towards those working in the corporate, non-profit and government sectors. It is pitched at the beginner/intermediate level.
About the presenters
Dr Zita Unger is an evaluator, educator and entrepreneur, drawing on 20 years of organisational development, business and governance experience. She was the founding director of an evaluation consultancy and developed an online survey management tool that received the AES Evaluation Policy and Systems Award. Zita is a Fellow of the Australasian Evaluation Society.
Anthea Rutter is a Research Fellow in the Centre for Program Evaluation at The University of Melbourne. She has advanced expertise in interviewing techniques, analysis and interpretation of data. Anthea is a founding member of the Australasian Evaluation Society and a current serving Victorian Committee member.
Zita and Anthea are joint recipients of the AES Evaluation Training and Service (ET&S) Award for outstanding contribution to the evaluation profession. Both have regularly presented at the American Evaluation Association (AEA) Conferences.