Researchers in development often hope that their research will ultimately influence policy. But getting from research findings to policymaker persuasion is an ongoing struggle. Yesterday I heard insights on this from Dasmine Kennedy of Jamaica’s Ministry of Education and Albert Motivans of Equal Measures 2030. (I also gave my two cents.)
Kennedy made three points:
Policymakers are interested in learning about the return on investment, so they want to see cost-benefit analysis.
Explain research in clear, non-technical language. “One of the greatest turn-offs is to come and start talking about all the different research techniques.”
Show how research aligns with policymakers’ own policy initiatives. “I know you’re focusing on this. I know you’re excited about learning about the outcome of your initiative, and this is what the data are saying.”
Motivans highlighted the importance of building data systems that provide the “right data at the right time,” so that when a specific policy question arises (as in, “What are dropout rates? What drives them?”), the answers are ready.
Of course (and this is me, not Motivans), data collection is costly, which requires prioritizing what we really want answered. In the context of education, this is an argument for regularly gathering student performance data (whether low-stakes or high-stakes): If we want to improve student performance, then it’s hard to justify not gathering regular, representative data on student performance.
This conversation was part of a longer panel on “Closing the Gender Data Gaps in Girls’ Education,” including Kennedy, Motivans, Stephanie Psaki, Thoai Ngo, and myself, hosted by the GIRL Center at the Population Council. You can view the full panel below or on the Population Council’s Facebook page. The conversation on communicating to policymakers takes place between 42:30 and 50:00.
Early in the panel, I talk about a new initiative I’m working on (with a team) to understand the gendered effects of learning interventions. Only about one-third of impact evaluations with learning measures disaggregate effects by gender. Recent work by J-PAL demonstrates that interventions to improve school access – even those not targeted to girls – often have dramatically larger effects for girls. We want to understand how learning interventions differentially affect boys and girls. (This work is supported by funding from Echidna Giving and the Umbrella Facility for Gender Equality.)
Related links – these are all by me; you can share your own related links in the comments!