9am–12:30pm WORKSHOP PROGRAM
- Communicating evaluation effectively (full day) – Kathryn Newcomer
- Planning an evaluation that works (full day) – Nea Harrison, Carol Watson
- Connecting systems thinking to evaluation practice (full day) – Janice Noga
- Outcome mapping: addressing the dynamics of change in complex environments (full day) – Ziad Moussa

1:30–5pm WORKSHOP PROGRAM
- All four morning workshops continue into the afternoon.
Communicating evaluation effectively
presented by Kathryn Newcomer, Professor and Director, The Trachtenberg School of Public Policy and Public Administration, The George Washington University, Washington, DC
FULL DAY – BEGINNER/INTERMEDIATE
The use and usefulness of evaluation work is highly affected by the effectiveness of reporting strategies and tools. Care in crafting both the style and content of findings and recommendations is critical to ensure that stakeholders understand and value the information provided to them. Skill in presenting sufficient information without overwhelming the audience is essential to raise the likelihood that potential users will be convinced of both the relevance and the credibility of the evidence provided to them. This course will provide guidance and practical tips on communicating evaluation findings. Attention will be given to the selection of appropriate reporting strategies and formats for different audiences, and to the preparation of: effective executive summaries; clear analytical summaries of quantitative and qualitative data; user-friendly tables and figures; discussion of limitations to measurement validity, generalizability, causal inference, statistical conclusion validity, and data reliability; and useful recommendations. The class will include some group exercises and cases.
Learning objectives include developing skills in:
- planning during evaluation processes to communicate effectively with stakeholders
- considering how 'evidence' may be transmitted to inform decision-making
- conveying the methodological integrity of evaluation work
- formulating feasible and actionable recommendations
- communicating effectively about evaluation work, including its:
  - scope and methods
  - major findings
  - quantitative and qualitative data analyses
About the presenter
Dr Kathryn Newcomer is the Director of the Trachtenberg School of Public Policy and Public Administration at the George Washington University where she teaches graduate level courses on public and nonprofit program evaluation, and research design. She is a Fellow of the National Academy of Public Administration, and currently serves on the Comptroller General’s Educators’ Advisory Panel. She served as an elected member of the Board of Directors of the American Evaluation Association (AEA) (2012–2015), and will begin service as AEA president in January 2017. She routinely conducts research and training for federal and local government agencies and nonprofit organizations on performance measurement and program evaluation, and has designed and conducted evaluations for many U.S. federal agencies and dozens of nonprofit organizations.
Kathryn has published five books, including The Handbook of Practical Program Evaluation (4th edition, 2015), Transformational Leadership: Leading Change in Public and Nonprofit Agencies (June 2008), and a volume of New Directions for Public Program Evaluation, Using Performance Measurement to Improve Public and Nonprofit Programs (1997), as well as over 60 articles in journals including Public Administration Review and the American Journal of Evaluation.
Kathryn served as President of the National Association of Schools of Public Affairs and Administration (NASPAA) for 2006–2007. She has received two Fulbright awards, one for Taiwan (1993) and one for Egypt (2001-04). She received the Elmer Staats Award for Achievements in Government Accountability, awarded by the National Capital Area Chapter of the American Society for Public Administration (2008). She has lectured on performance measurement and public program evaluation in Ukraine, Brazil, Italy, Israel, the United Arab Emirates, Poland, Costa Rica, Egypt, Taiwan, Colombia, Nicaragua, and the UK.
Kathryn earned a B.S. in secondary education and an M.A. in Political Science from the University of Kansas, and her Ph.D. in political science from the University of Iowa.
Planning an evaluation that works
presented by Nea Harrison, Pandanus Evaluation & Planning Services, and Carol Watson, Carol D Watson Planning and Evaluation Services
FULL DAY – BEGINNER
This practical one-day workshop provides an introduction to key evaluation concepts, methods and planning processes. It will enable participants to get started in evaluating their own programs and projects in collaboration with key stakeholders and beneficiaries.
By the end of the workshop participants will have:
- considered the critical issues in planning and conducting an evaluation and reporting results, including the importance of context, values and ethics
- considered key stakeholders and their information needs
- developed a program logic model
- developed evaluation questions
- identified indicators of success
- considered appropriate data collection and reporting methods
- explored ways to ensure evaluation findings are useful and used
The workshop draws on utilization-focused, participatory and empowerment evaluation theories and methods. It introduces participants to key evaluation concepts and steps them through the participatory planning processes involved in developing an evaluation plan.
The workshop will be conducted in a participatory manner using adult learning principles. Workshop facilitators will use a range of workshop techniques such as short presentations, including illustrative examples from practice, and large and small group discussions that draw on the knowledge and experiences of the group. Participants will work through a case study in small groups to develop a program logic and plan. Participants will be provided with supporting handouts, planning worksheets and links to online evaluation websites and resources. They will take away the necessary tools to get them started on an evaluation plan for their own program/project.
The workshop is aimed at people who want to develop the knowledge and skills to meaningfully involve stakeholders in planning a rigorous and useful program evaluation. It is suitable both for people who are new to evaluation and for those with some evaluation experience who wish to deepen their knowledge of the participatory process of planning a systematic program evaluation.
About the presenters
Nea Harrison is Director of Pandanus Evaluation & Planning Services. She conducts high quality evaluation, participatory planning and evaluation capacity development work for a wide range of government, non-government and community based agencies throughout Australia and internationally. She has a background in research, management and program and policy development in the education, health, and social service sectors, a Master of Education (Honours), University of New England, a Graduate Diploma in Education, and a Bachelor of Arts, University of Sydney.
Nea was awarded the 2012 Australasian Evaluation Society (AES) Community Development Evaluation Award for Excellence for the participatory evaluation conducted in partnership with the Australian Red Cross (NT) Communities for Children Program and the Palmerston and Tiwi Islands Communities for Children Local Committees.
Carol Watson is an experienced service and program planner and evaluator. She has a 30-year history working in Aboriginal health, public health, alcohol and other drugs, mental health and housing through her various roles in the NT Department of Health, in NSW Health, as a senior researcher for the Cooperative Research Centre for Aboriginal Health, and as a consultant (Carol D Watson Planning and Evaluation Services). She has a Master of Public Health from UNSW, as well as a PhD from the University of Chicago and a BSc (Honours) from the University of Washington.
Nea and Carol conduct two-day, one-day and half-day versions of this practical workshop. Evaluation feedback indicates that the workshop is engaging, clear and logical, and makes the process of developing a sound evaluation plan meaningful and achievable.
Connecting systems thinking to evaluation practice
presented by Janice Noga, Pathfinder Evaluation and Consulting, Cincinnati, Ohio, USA
FULL DAY – INTERMEDIATE/ADVANCED
Systems thinking helps us as evaluators to understand the world in all its diversity in ways that are practical, comprehensive, and wise. A systems approach to evaluation is particularly useful in situations where rigorous rethinking, reframing, and unpacking complex realities and assumptions are required. Evaluations based on systems thinking generate rich descriptions of complex, interconnected situations that help stakeholders build a deeper understanding and inform choices for subsequent action.
The purpose of this workshop is to:
- introduce fundamental concepts needed to utilize systems thinking in evaluation practice
- provide opportunities to make concrete connections between systems concepts and the application of systems thinking to evaluation practice
Workshop attendees will learn:
- core systems concepts and how they relate to evaluation practice
- how to apply systems thinking to evaluation practice
- how to use systems perspectives to:
• deepen contextual understanding of programs
• clarify program theory and logic
• expand stakeholder/client understanding of programs and their outcomes
The workshop will address the following topics:
- What is systems thinking?
- connecting core systems concepts to evaluation – boundaries, perspectives, and interrelationships
- addressing complexity and emergence using systems thinking
• How do we balance the intuitive with the logical in using systems thinking to evaluate complex situations and ecologies?
• How can systems thinking be used to better understand the nature and dynamics of complex programs and to inform evaluation of program impact and effectiveness?
- synthesis and discussion
Four questions will organise learning for the day:
- Cognition: How do I understand this?
- Perception: What does this mean to me?
- Relevance: Why does this matter?
- Use: How can I apply this to practice?
This workshop is targeted to intermediate/advanced evaluators interested in applying systems thinking in evaluation practice.
About the presenter
Jan Noga is an independent consultant based in the United States who has taught numerous workshops for non-profit, community, and government groups as well as graduate courses in systems thinking, research methods and techniques, and survey design and analysis. As a program evaluator, Ms Noga has planned and conducted both large and small-scale evaluations and provided organizational consulting and capacity building support to clients in areas including systems thinking, logic modeling, and evaluation design and implementation. She is particularly interested in the use of systems theory and thinking as a foundation for planning, implementation, and evaluation of change efforts in the human service and education arenas.
Outcome mapping: addressing the dynamics of change in complex environments
presented by Ziad Moussa, President of the International Organization of Cooperation in Evaluation (IOCE)
FULL DAY – INTERMEDIATE/ADVANCED
Development practitioners often face the challenge of capturing the richness of their 'development journey' towards human, social and environmental wellbeing. Outcome mapping is an approach to planning, monitoring and evaluation that illuminates this journey by understanding and measuring changes in the behavior of the people with whom a development initiative works most closely.
It is based on the idea that, in order to succeed, an intervention needs to involve multiple stakeholders, each with their own particular commitments, interrelationships and definitions of success. It offers a template for creating mutually supportive intervention strategies and connects 'outputs' to 'outcomes' (defined as changes in behavior influenced by an intervention) by focusing on the patterns of action and interaction among stakeholders. As such, Outcome Mapping provides the basis for engaging participants in measuring, learning from and adapting their desired outcomes.
Outcome Mapping is divided into three stages. The first stage, Intentional Design, helps a program establish consensus on the macro-level changes it will help to bring about and plan the strategies it will use. It helps answer four questions: Why? (What is the vision to which the program wants to contribute?); Who? (Who are the program's boundary partners?); What? (What are the changes that are being sought?); and How? (How will the program contribute to the change process?). The second stage, Outcome and Performance Monitoring, provides a framework for the ongoing monitoring of the program's actions and the boundary partners' progress toward the achievement of outcomes. It is based largely on systematized self-assessment. The third stage, Evaluation Planning, helps the program identify evaluation priorities and develop an evaluation plan.
Over a full day workshop, Outcome Mapping Learning Community (OMLC) Steward Ziad Moussa will:
- walk participants through the seven steps of the Intentional Design stage of Outcome Mapping, including several hands-on exercises, and introduce the Outcome and Performance Monitoring and Evaluation Planning stages
- introduce the contribution of Outcome Mapping to systems thinking and complexity-based approaches to planning, monitoring and evaluation
- introduce the contribution and uses of Outcome Mapping in building a 'Theory of Change' model for a project or a program
He will use the standard Outcome Mapping training material as well as material from the newly developed Outcome Mapping Practitioners guide. More on Outcome Mapping can be found at www.outcomemapping.ca, and participants are strongly encouraged to download and familiarize themselves with the Outcome Mapping guide book (http://www.outcomemapping.ca/download/OM_English_final.pdf).
About the presenter
Ziad Moussa is the President of the International Organization of Cooperation in Evaluation (IOCE) and the Co-Chair of EvalPartners. The IOCE represents national and regional Voluntary Organizations for Professional Evaluation (VOPEs) in the Americas, Africa, Asia, Australasia, Europe, the Commonwealth of Independent States, and the Middle East. It strengthens international evaluation through the exchange of evaluation methods, theories and practice, and promotes good governance and recognition of the value evaluation has in improving people's lives. It is committed to cultural diversity, inclusiveness and bringing different evaluation traditions together in respect of that diversity.
EvalPartners is an innovative partnership to enhance the capacities of Civil Society Organizations (CSOs) to influence policy makers, public opinion and other key stakeholders so that public policies are based on evidence and incorporate considerations of equity and effectiveness. EvalPartners brings together over 45 international development agencies, regional evaluation networks, foundations, NGOs and others. The objective of the initiative is to enhance the capacities of CSOs to engage in a strategic and meaningful manner in national evaluation processes, contributing to improved country-led evaluation systems and policies that are equity-focused and gender equality responsive.
Ziad has over 20 years of experience leading complex multi-country evaluations across the Global South, spanning well over 40 countries and almost every major donor on the circuit. He is credited with the Arabization of the reference book Outcome Mapping: Building Reflection and Learning into Development Programs.