We are happy to provide consultancy support for organisations carrying out their own evaluations, or to design and deliver full evaluations on their behalf.
Completed projects include reviews of program design, as well as implementation and outcome evaluations of projects, policies and programs, including interdepartmental strategies. Beyond data collection, projects may also include literature reviews, program logic development or review, and the development of coherent, integrated monitoring and evaluation frameworks.
Monitoring & Evaluation Frameworks
Developing an evaluation framework means formulating a planned approach to monitoring and evaluating your program or strategy. We help you define the program to be evaluated, including its activities, success factors, outcomes, and context. We then guide you through each design decision in turn: purpose and audience, appropriate scale, evaluation questions, performance criteria and indicators, and finally data sources and methods.
Program Logic
A program logic sets out an intervention’s underlying theory of change, and its accompanying activities, outcomes, and assumptions. Developing an explicit program logic ensures that all stakeholders have a common understanding of a program’s design and objectives. Shelby helps clients develop or review their program logic to describe new programs or to clarify existing programs. This program model may then be used as a basis for developing an integrated monitoring and evaluation framework.
Program Evaluation
Program evaluation is a systematic review of how well a program is being implemented (formative evaluation) or is achieving its objectives (summative evaluation). Typically, evaluations consider a program’s quality, effectiveness and appropriateness. Shelby takes a pragmatic approach, designing evaluations that fit the scope and organisational context of the program. We select data collection methods and develop rigorous, robust tools that will elicit the required data. We analyse the qualitative and quantitative data collected, synthesise the findings, and respond to the evaluation questions.
Literature Reviews
By sourcing and summarising the most recent research on a topic, Shelby identifies and documents contemporary theories, evidence and issues, and models of ‘good practice’, as well as gaps or areas for further exploration. Shelby has demonstrated ability in producing quality reviews, either as stand-alone documents or as part of a comprehensive program evaluation. These reviews inform the development or review of program designs and evaluations.
Survey Development & Administration
Shelby specialises in the design, piloting and systematic administration of surveys that gather quantitative and qualitative data from a wide range of stakeholders. These include surveys that gauge stakeholder perceptions, document stakeholders’ views and opinions, and/or identify issues affecting the success of a program. Responses are generally collected online or by phone.
Stakeholder Consultation & Focus Groups
Stakeholder interviews and focus group discussions, used on their own or alongside other methods, elicit rich qualitative data. For example, they might be used to explore a subject to inform survey design, or may form part of a mixed methods evaluation. Shelby has extensive experience conducting interviews and focus groups with a wide variety of stakeholders, from heads of organisations to vulnerable clients, across a broad range of sectors.
Case Studies
Case studies are especially illuminating when investigating a program in depth in its real-world context. They are particularly good at capturing complexity, especially where context interacts with the case. We carefully design our case studies, incorporating multiple sources and types of evidence to provide rich, robust data focussed, as planned, on exploring, describing or explaining the program.
Qualitative & Quantitative Data Analysis
Shelby takes a pragmatic mixed methods approach to gathering evidence, tailoring our data collection methodologies and methods to suit project context and constraints. Where appropriate we seek different forms and sources of data, and we systematically organise, analyse and interpret what we obtain to provide credible evidence and practical recommendations. We routinely work with both qualitative and quantitative data, believing that they play important and complementary roles, and we are skilled at data cleaning, coding and analysis. We apply methods ranging from simple frequency tabulations and averages to multiple regression, inferential statistics, and cluster and factor analysis, matching the analysis to the data and project scope, and calling on our skilled associates as required.