When planning an activity or intervention, it is useful to look at the findings of existing research to identify effective approaches that could be adopted.  Existing research can help answer questions such as:

  • What effective interventions have been tried elsewhere?
  • What are the strengths and limitations of those interventions?
  • Are there good reasons for thinking that the interventions are likely to work for us, given our values, the impact we seek and the context in which we work?
  • If we decide to adopt a similar approach, what kind of things should we pay attention to as we implement and evaluate our work?

The Cabinet Office publication Quality in Qualitative Evaluation: A framework for assessing research evidence provides suggested criteria for assessing the quality of research reports, whatever research methods have been used.  It recommends looking first at the findings of the research, then at features of the research process.

1.  Findings and interpretation

  • Are the findings grounded in the evidence presented, or do they draw inferences from external material or opinion? If so, is this clearly stated?
  • Do the conclusions/findings make sense and have a coherent logic?
  • Does the research explain key concepts and definitions that it employs?
  • Does the research address the aims and objectives as set out in the brief or as redefined during the study?
  • What is the scope for drawing wider inferences from the findings and how well is this explained?
  • Is there any reason to think that the evidence presented here may not be applicable in other contexts or may not apply to particular groups of people? (e.g. urban/rural, men/women)
  • Is there any reason to think that important contextual or other factors may have altered since the research was undertaken?
  • Does the research identify the implications for policy and practice?

2.  Research design, sampling and data collection

  • Do all the elements of the research design, including the methods, help to meet the aims of the study?
  • Were the research methods used appropriate to the questions being asked?
  • How appropriate is the sample design or selection of cases?
  • Were the methods conducted properly? Is there information on how the approaches were implemented (e.g. response rates for surveys, sampling information, copies of the questionnaires/questions used)? A response-rate example is sketched below.
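
To make the response-rate point concrete, here is a minimal sketch in Python, using entirely hypothetical figures, of how a survey response rate is calculated and why a low one matters:

```python
# Hypothetical figures for illustration only.
invited = 1200    # questionnaires sent to the sampled group
completed = 384   # usable responses returned

response_rate = completed / invited
print(f"Response rate: {response_rate:.0%}")  # -> Response rate: 32%

# A low rate means the achieved sample may differ systematically from
# those who did not respond, so check whether the report discusses
# possible non-response bias.
```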

3.  Analysis

  • Has comprehensive, appropriate and accurate analysis been conducted? (e.g. level of analysis, treatment of missing data, misrepresentation of small samples/sub-groups, good use of tables, figures and charts/graphics, etc.). The risk with small sub-groups is illustrated after this list.
  • Are the issues considered from a range of perspectives?
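
To see why small samples and sub-groups are easily misrepresented, here is a minimal sketch in Python, again with hypothetical figures, using a Wilson score interval (one common way to gauge the uncertainty around a proportion) to show how little weight a percentage from a small sub-group can bear:

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical sub-group: 7 of 10 rural respondents agreed (70%).
low, high = wilson_interval(7, 10)
print(f"70% agreement, 95% CI: {low:.0%} to {high:.0%}")
# -> 70% agreement, 95% CI: 40% to 89%
```

An interval that wide is a reminder that a headline percentage drawn from ten respondents tells you very little on its own.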

4.  Reporting

  • How clear and coherent is the reporting?
  • Is there a clear link between the data, interpretation and conclusions?
  • Are the key messages highlighted and summarised?
  • Are the research outputs presented in an appropriate format for the audiences?

5.  Reflection on the research

  • Are the assumptions, theoretical perspectives and values that have shaped the research made explicit?
  • Is there evidence of openness to new and alternative ways of viewing the subject?
  • Is there awareness of the limitations of the approach, methods and evidence?

6.  Ethics and access

  • Were appropriate ethical guidelines adhered to? (e.g. confidentiality, anonymity, informed consent)
  • Is there evidence of sensitivity to the research context and participants?
  • Did the research make provision to enable the participation of all relevant parties?
  • Did the researchers have a sufficient level and appropriate type of experience to undertake the research?

7.  Audit

  • Has the research process been well documented?
  • Is there documentation and discussion of changes to the research design?
  • Are the main study documents included?

These criteria are not meant to be applied rigidly; they are an aid to informed judgement, which should also draw on your professional experience and on the weight you wish to give to particular criteria.