Ensure appropriate data are being collected and tracked in safe and accountable ways

Where possible, it is ideal to build user profiles – of individuals and/or segmented groups – in order to understand whether the digital solution is meeting their needs, as well as whether it is meeting project goals. Establishing measures to collect and analyse data responsibly is critical to protect both users and the organization itself.

  • Determine appropriate data elements to collect. Not all possible data should be collected: doing so can overwhelm the system, burden users with additional tasks and, crucially, increase the risk of infringing on users’ privacy. The appropriate data elements should be determined by the goals and success metrics for the digital solution and the project as a whole, balanced against the convenience and privacy of the user.

  • Ensure multiple monitoring ‘views’ in the digital solution. As an example, the GSMA (2017a) recommends having i) a user view, to understand how users engage with the digital solution overall; ii) a content view, to determine for example which content is accessed more or less, and which is abandoned halfway through viewing; and iii) a service view, which can show how different users journey through the entire digital solution or drop off at certain pain points. Drawing on such views, the GSMA found that male and female users of the 3-2-1 Service in Madagascar had very different user journeys, with varying content preferences and drop-off points. A minimal sketch of how such views can be derived from usage data follows this list.

  • Ensure privacy, security and responsible management of data. It is critical to follow best practices (United Nations, 2014) to protect the data collected about users and to be completely transparent about its intended uses. Low-skilled and low-literate users especially may find it challenging to understand the process and implications of giving consent for their data to be used. Time must be taken to explain the rationale and the consequences to the users.

  • Practise algorithmic accountability. Any organization conducting data analysis must ensure that the data as well as the analysis of it are open to public scrutiny. Of course this must be done in a way that does not compromise the privacy of the users. Further, the onus is on the organization to explain to the end-users – in easily understandable ways – the source of the data, the logic of the data analysis and the findings.

  • Complement automatically collected data with human feedback. Automatically collected data might not pick up on all the needs, concerns or views of the end-users. Face-to-face research – such as through focus groups and interviews – is very important to maintain the human element in the monitoring process and contribute to a more complete picture of usage. Data such as gender and education level can be captured in person when registering users. Many of the UNESCO-Pearson case studies involve intermediaries meeting regularly with end-users, providing further opportunities for data collection.
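
As a rough illustration of the three monitoring ‘views’ described above, the sketch below assumes a hypothetical log of usage events (user, content item, last step reached, completion flag) and derives simple user, content and service summaries from it. The event fields and values are illustrative assumptions, not features of any particular platform.

```python
from collections import Counter, defaultdict

# Hypothetical usage events: (user_id, content_id, last_step_reached, completed).
# A real deployment would define these fields against its own goals and metrics.
events = [
    ("u1", "lesson_1", 5, True),
    ("u1", "lesson_2", 2, False),
    ("u2", "lesson_1", 1, False),
    ("u2", "lesson_1", 5, True),
]

# User view: how much each user engages with the solution overall.
sessions_per_user = Counter(user for user, *_ in events)

# Content view: which content is accessed, and which is abandoned partway through.
access_count = Counter(content for _, content, *_ in events)
abandon_count = Counter(content for _, content, _, completed in events if not completed)

# Service view: at which step of the journey unfinished sessions drop off.
drop_off_steps = defaultdict(int)
for _, _, last_step, completed in events:
    if not completed:
        drop_off_steps[last_step] += 1

print("User view:", dict(sessions_per_user))
print("Content view:", {c: (access_count[c], abandon_count[c]) for c in access_count})
print("Service view:", dict(drop_off_steps))
```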

Include the full range of stakeholders in data analysis

Data is not only relevant for the developers of the digital solution while they manage and improve it. Analysed data can also be presented to the actual users, from refugees to government officials, so that they can use it, for example, to improve livelihood activities. The following activities are recommended:

  • Map and serve the data needs of the full range of user stakeholders. Different user groups have varying data provision roles and information needs. Khushi Baby provides a good example of how, as data is collected along the user chain, it is aggregated and analysed for district-level decision-making related to health administration. The mothers and CHWs are active participants in data generation for programmatic and policy interventions – in real time. In turn, they benefit from the data analysis, which informs the treatment they receive or administer. Each of the three user groups – mothers, CHWs and district officials – interfaces with appropriately designed technology: wearable necklaces, mobile data collection apps and web-based dashboards respectively. A rough sketch of this kind of district-level roll-up follows this list.

  • Leverage big data generation by end-users for sector support. The fact that end-users are low-skilled and low-literate, and often interface with very simple solutions, does not exclude the opportunity for data collection and complex analysis by solution providers, as with the Crop-Specific Mobile Apps.

  • Ensure quality control when low-skilled users are data generators. Steps need to be taken to ensure that when low-skilled and low-literate users generate data, or use a system for the benefit of others, there is sufficient quality control to protect the users and their beneficiaries. Such control can happen offline, for example through intermediaries, or be automated, as with hearScreen™.
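
As a rough sketch of the Khushi Baby-style data chain described above, the example below assumes hypothetical visit records captured by CHWs (each linked to a mother’s ID and her district) and rolls them up into the kind of district-level summary a web-based dashboard might display. All field names and figures are illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical visit records captured by CHWs with a mobile app; each record is
# linked to the ID on a mother's wearable necklace and to her district.
visits = [
    {"mother_id": "m001", "district": "Udaipur", "vaccinated": True},
    {"mother_id": "m002", "district": "Udaipur", "vaccinated": False},
    {"mother_id": "m003", "district": "Jaipur", "vaccinated": True},
]

# District view: roll individual records up into the summary figures a
# district official's web dashboard might show.
summary = defaultdict(lambda: {"visits": 0, "vaccinated": 0})
for v in visits:
    row = summary[v["district"]]
    row["visits"] += 1
    row["vaccinated"] += int(v["vaccinated"])

for district, row in summary.items():
    coverage = row["vaccinated"] / row["visits"]
    print(f"{district}: {row['visits']} visits, {coverage:.0%} vaccinated")
```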

Analyse big data to support sector interventions

By tracking farmer usage of each of the Crop-Specific Mobile Apps in rural India, the company behind them can identify in which districts farmers need to diversify their crops, where they are diversifying but need guidance, and where new disease outbreaks are likely to be happening. Such usage data can be sent to the cloud server via SMS, if needed, to ensure collection in low-connectivity districts. The farmers thus become rich data sources for government interventions triggered at a district or state level. Collecting, analysing and passing on the data creates a potential revenue stream for the solution provider.

[Image: Crop-Specific Mobile Apps]
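
A minimal sketch of the kind of analysis described above, assuming usage reports arrive as short SMS-style strings (district, crop, query type) that are parsed and aggregated per district. The message format, thresholds and flags are illustrative assumptions, not the provider’s actual pipeline.

```python
from collections import defaultdict

# Hypothetical SMS-style usage reports in the form "district|crop|query_type".
# The format and the flagging thresholds below are assumptions for illustration.
sms_reports = [
    "Nashik|grape|disease",
    "Nashik|grape|disease",
    "Nashik|grape|irrigation",
    "Pune|onion|market",
    "Pune|tomato|disease",
]

crops_by_district = defaultdict(set)
disease_queries = defaultdict(int)

for report in sms_reports:
    district, crop, query_type = report.split("|")
    crops_by_district[district].add(crop)
    if query_type == "disease":
        disease_queries[district] += 1

# Flag districts for possible intervention: low crop diversity, or a cluster
# of disease-related queries that may indicate an outbreak.
for district, crops in crops_by_district.items():
    if len(crops) < 2:
        print(f"{district}: low crop diversity ({', '.join(sorted(crops))})")
    if disease_queries[district] >= 2:
        print(f"{district}: possible disease outbreak ({disease_queries[district]} queries)")
```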

Automate quality control of usage and data generation

In most developing countries there is a dearth of trained professionals to ensure that children and adults receive auditory tests to monitor for hearing problems. hearScreen™ is an app that allows anyone with very limited training and a set of headphones to conduct hearing tests. False positives are sent to the persons administering the test (the screeners) and tracked to see whether they are recorded as legitimate responses from the patient, in order to create an individual screener quality index. The index acts as a measure for quality control, and system reports inform supervisors about screeners who need further training.

[Image: hearScreen™]
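
The screener quality index described above can be read as a simple catch-trial statistic: the app presents false positives that should elicit no patient response and tracks how often the screener records one anyway. The sketch below is a minimal illustration of that idea; the exact calculation used by hearScreen™ is not given in this text, so the scoring shown is an assumption.

```python
def screener_quality_index(trials):
    """Share of false-positive probes a screener handles correctly.

    Each trial is (is_false_positive, response_recorded). On a false-positive
    probe no genuine patient response is expected, so recording one counts
    against the screener. The scoring itself is an illustrative assumption.
    """
    probes = [t for t in trials if t[0]]
    if not probes:
        return None  # no probes presented yet
    correct = sum(1 for _, responded in probes if not responded)
    return correct / len(probes)


# Hypothetical screening session: three probes, one wrongly recorded as a response.
trials = [(True, False), (False, True), (True, True), (True, False)]
index = screener_quality_index(trials)
print(f"Screener quality index: {index:.2f}")  # 0.67 -> supervisor may flag for training
```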
