As you know, at PEPY we are intent on improving how we understand and measure our impact. Recently we appointed one of our staff as a full-time Monitoring and Evaluation Officer, and soon we will welcome a new Monitoring and Evaluation Consultant to support his work and that of our program managers going forward.
This year we were also pleased to welcome Michael Brian, a student from Touro University, California, who was conducting a field study in Monitoring and Evaluation for his Master's in Public Health. Michael volunteered with PEPY for ten weeks, working with our Community Development team to develop evaluation tools. Together they tackled tricky questions about how to measure whether young people in Chanleas Dai are more “empowered” through participating in our clubs.
With ideas such as “empowerment”, definitions can be argued over for a long time, and it is genuinely difficult to develop measures of such an idea that take into account the uniqueness and unpredictability of human nature. That doesn’t mean, however, that we should give up on understanding and evaluating what these concepts mean in real life. Through his research, Michael identified five key factors critical to understanding “empowerment”.
Below, Michael shares how he and the team developed surveys to measure PEPY’s impact, and how he supported our staff through training sessions and systems improvement:
“My field study with PEPY presented an opportunity to apply the principles of Program Planning and Evaluation. Although LogFrame development was postponed until later in the year, I gained experience working on a logic model for developing “life skills” for the Child to Child project, and applied it by creating the children’s and facilitator surveys in collaboration with PEPY’s Community Development Program (CDP) team. We used a participatory approach to construct a comprehensive survey scale measuring empowerment among adolescents in Chanleas Dai commune. CDP facilitators and teachers pilot-tested the children’s survey, and many translation issues were identified and corrected. Research methodology was applied in reviewing validated surveys, in data collection and analysis, and in creating a template for a manual. Professionalism and ethics were observed, for example, in respecting the various stakeholders during the participatory process of survey and M&E system creation. The children’s survey received final approval by mid-October, and the facilitator survey was completed in early November.
CDP staff, managers and PEPY’s newly appointed Monitoring and Evaluation Officer took part in training sessions on administering both surveys and on collecting and analyzing the resulting data. An M&E manual was created, partly using “dummy” data from one of the surveys; it included an introduction to M&E data collection and administration, as well as data analysis techniques such as inferential statistics. The training sessions reinforced the manual as a tool for explaining external and internal validity, for demonstrating data input and analysis in spreadsheets, and as a partial guide for PEPY’s future M&E implementation plan. The final presentation gave an overview of M&E, highlighted the accomplishments of the last ten weeks, and challenged the staff to continue working collectively to use an effective and efficient Monitoring Information System (MIS) as a continual feedback loop that maximizes informed decision making.
I want to express my profound respect and gratitude to all members of the CDP team & management and to all the volunteers & support staff who made this field study the richly rewarding experience, both personally and professionally, that it truly has been.”
The surveys, evaluation tools and presentation referenced above are available below. Don’t hesitate to get in touch at email@example.com if you have any questions or would like further information.
Facilitator Survey Questionnaire / Children’s Survey Questionnaire / Children’s Survey Evaluation / Children’s Survey Data Analysis / Monitoring & Evaluation Presentation