
Evaluating research impact: Reporting on activities and telling the stories
Wednesday, 06 November 2013 00:00

Heat is North America’s primary weather-related killer of vulnerable citizens living in economically disadvantaged conditions. To address this concern, the Knowledge Mobilization Unit at York University (Toronto, Canada) supported a research collaboration between a York graduate student and a community centre in a low-income Toronto neighbourhood.

Through this research collaboration, Canada's first heat registry was created in 2007, and in 2012 the City of Toronto released its Heat Registry Guide, benefiting more than 2.5 million citizens. The guide makes it easier for neighbourhoods to track and provide services to vulnerable citizens on the hottest days of the year, and it lessens the burden on Canada's healthcare system by preventing heat-related emergencies. This is the impact university research can have when the university becomes more accessible and responsive to community partners.

Is this how you measure your knowledge mobilisation (also known as Research Uptake) activities? Do you tell success stories like this, or do you report on activities? It is important to do both.

At York University we use a mix of both quantitative and qualitative metrics that illustrate how research can move to policy or practice by following a logic model designed for knowledge mobilisation.

Research that is co-produced with and disseminated to partner organisations is taken up by partners and implemented in policies, programmes and services that benefit citizens in local communities. We count activities associated with the conduct of the co-produced research and with its dissemination. But we tell stories of the implementation and impacts of this research when it has been used by partners in ways that create benefits for citizens.

Measures of the conduct of research include:

  • number of researchers, students and partners involved in the research
  • funding associated with the research

Measures of dissemination to partners/decision-makers include:

  • number of clear language research summaries
  • electronic media (e.g. web pages) developed to disseminate research to partners/decision-makers
  • number of knowledge exchange events with partners/decision-makers
  • social media analytics

Measures of uptake include:

  • invitations to present research to partner/decision-maker organisations
  • students involved in supporting partners/decision-makers assessing the research

Measures of implementation include:

  • stories of partners/decision-makers using research to inform new policies, services and programmes

Measures of impact include:

  • stories of partners/decision-makers releasing new policies, services and programmes informed by research
  • stories of communities and/or citizens served by new policies, services and programmes informed by research

We count activities (quantitative metrics) for research and dissemination. We tell stories (qualitative metrics) of implementation and impact.

If you want to demonstrate the impact of your Research Uptake activities, you need to stay in touch with your partners and decision-makers and ask them what they did with the research you either co-produced with them or disseminated to them. York's Knowledge Mobilization Unit is systematically consulting with partners we worked with in 2008 and 2009. We are going back five years because it can take that long for research to inform policy and practice, as the Heat Registry story above illustrates.

Research Uptake is a long-term undertaking. Investing in capacity for Research Uptake and then tracing those activities over time will provide you with stories of the impact of your work on the lives of citizens in your communities.

How are you measuring your Research Uptake activities? Use the comment section below to start a discussion about evaluation and measurement of Research Uptake.

Dr David Phipps is the Executive Director at Research and Innovation Services, York University, Canada