All resources for Online Community Engagement for PVE
In this section you will find all the resources stored by this Hub. See instructions on how to use this page below:
1) Resources: appearing in white boxes, resources are listed in order of relevance (indicated by the percentage shown). Each resource also includes the following features:
– Tags: for further filtering and sorting by i) categories from the diagnostic tool (shown as buttons) and ii) keywords (shown as hyperlinks)
– Arrow icon: to access the resource
– Folder icon: to save the resource to a folder
2) Guidance cards: appearing in blue boxes, they offer guidance and background information on the topics you selected in the diagnostic tool.
3) Menu of icons on the right: additional functionalities to explore
You may download all the guidance information in a single Guidance Document, which covers setting up your OCEA for each type and phase of activity, type of resource, and target audience.
Monitoring & Evaluation
Monitoring and Evaluation (M&E) is an important part of OCEAs. Evidence-based programming and learning through M&E allow us to improve the engagement of our audience over time.
If you offer content, it can be of great value to ask, in surveys or focus groups, about the quality of specific content, along with any other questions that help evaluate whether the objectives of the OCEA have been met. Improving your content is the number one way to improve your engagement, so knowing what works and what doesn’t is vital.
When running an OCEA you might expect specific changes in the target audience, such as changes in knowledge/awareness, attitude, and/or behavior, as a direct impact of your OCEA. If so, you might want to use pre- and post-OCEA surveys. You can see our list of survey tools here.
Monitoring and Evaluation of Virtual Exchange
This report examines the programmatic outcomes of Virtual Exchanges and provides a review of evaluations.
Monitoring and Evaluation in E-Learning
This article provides five insights into M&E practices for E-learning, based on the experience of Soliya and the William Davidson Institute.
Monitoring and Evaluation: Common Pitfalls
This article provides an overview of common pitfalls in M&E with a focus on its online implementation.
Remote Developmental Evaluation: A Roundtable for Funders and Practitioners
This roundtable discussion aims to provide guidance for remote developmental evaluation for people with a background in embedded evaluation practices.
Remote Developmental Evaluation: A Guide for Funders and Practitioners
A guide for moving the embedded practices of Developmental Evaluation online, commissioned in response to COVID-19.
M&E Thursday Talks
This initiative by DM&E provides new talks on M&E every Thursday, many of which are specific to online M&E.
How to Calculate Net Promoter Score
This resource introduces the Net Promoter Score and includes a formula that can be used to calculate it.
Net Promoter Score Calculation – Surveymonkey
Instructions on how to calculate a Net Promoter Score, with information on how this can be done using SurveyMonkey.
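The two resources above describe the Net Promoter Score in detail. As a quick illustration, here is a minimal sketch of the standard calculation on the usual 0–10 rating scale (promoters rate 9–10, detractors rate 0–6, and the score is the percentage of promoters minus the percentage of detractors); the function name and sample ratings are illustrative only, not taken from either resource.

```python
def net_promoter_score(ratings):
    """Compute a Net Promoter Score from a list of 0-10 survey ratings.

    Promoters rate 9-10, detractors rate 0-6; passives (7-8) count only
    toward the total. NPS = %promoters - %detractors, giving -100 to +100.
    """
    total = len(ratings)
    if total == 0:
        raise ValueError("No ratings provided")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / total

# Example: 4 promoters, 3 passives, 3 detractors out of 10 responses -> NPS = 10
print(net_promoter_score([10, 9, 9, 10, 8, 7, 7, 6, 3, 5]))
```

The same arithmetic applies whether the ratings come from your own post-OCEA survey export or from a tool such as SurveyMonkey.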
Monitoring and Accountability Practices for Remotely Managed Projects
This publication examines how remote monitoring is currently being utilized and the key challenges associated with it.