
FASTR RMNCAH-N service use monitoring: Methodology documentation

Introduction to FASTR

The Global Financing Facility (GFF) supports country-led efforts to improve the timely use of data for decision-making, ultimately leading to stronger primary healthcare (PHC) systems and better reproductive, maternal, newborn, child, and adolescent health and nutrition (RMNCAH-N) outcomes. This set of initiatives and technical support is referred to as Frequent Assessments and Health System Tools for Resilience (FASTR).

FASTR encompasses four technical approaches: (1) RMNCAH-N service use monitoring using routine health management information system (HMIS) data, (2) rapid-cycle health facility phone surveys, (3) high-frequency household phone surveys, and (4) follow-on analyses. This methodology documentation focuses specifically on the first approach: RMNCAH-N service use monitoring.

RMNCAH-N service use monitoring

The GFF collaborates with Ministries of Health to conduct rapid-cycle analyses of routine HMIS data. This approach addresses three core objectives:

  1. Assess data quality at national and sub-national levels to identify and address completeness, accuracy, and consistency issues
  2. Track service utilization changes by measuring monthly shifts in priority RMNCAH-N health service volumes
  3. Monitor coverage progress by comparing service delivery trends against country-specific targets and benchmarks

These analyses focus on priority indicators tied to national health reforms and World Bank investments, with findings informing country planning processes and project implementation cycles. During the COVID-19 pandemic, the GFF supported Ministries of Health in over 20 countries to monitor the impact of the pandemic on essential health services using this approach.


Figure 1. Steps to implement RMNCAH-N service use monitoring

Why rapid-cycle analytics?

Existing health systems data sources are critical but often come with challenges that limit their use. HMIS data may not be analyzed promptly or may be perceived as too low-quality to use for decision making. Traditional in-person household and facility-based surveys demand extensive resources and time, with long lags between survey design, data collection, and the availability of findings. This prevents decision-makers from using data to drive meaningful improvements in health outcomes. To fill this gap, the GFF supports countries to develop and use rapid-cycle analytic approaches.


How does it work?

Rapid-cycle analytic approaches provide timely, rigorous, and high-priority data that respond to each country's specific priorities and data use needs. This continuous cycle of analyze-learn-strengthen-act seeks to improve the systematic use of data for decision-making towards improved RMNCAH-N outcomes.


Figure 2. FASTR's rapid-cycle analytics approach: Analyze, learn, strengthen, act

Technical approaches

FASTR's four technical approaches, underpinned by capacity strengthening and data use support, enable countries to use rapid-cycle analytics for strengthening PHC systems and improving RMNCAH-N outcomes through the timely and high-frequency analysis and use of data.

  1. Analysis of routine HMIS data assesses data quality, quantifies changes in priority health service volumes, and compares trends in service coverage to country targets for priority RMNCAH-N indicators.

  2. Rapid-cycle health facility phone surveys assess the performance of PHC facilities, monitor the implementation of reforms, identify the impact of shocks, and track changes over time. The phone survey is administered to a representative panel of PHC facilities over four quarterly contacts a year.

  3. High-frequency household phone surveys provide a snapshot of care seeking behavior, foregone care, financial protection, service coverage, and patient experience of care. Household surveys are currently done in partnership with the World Bank's Living Standards Measurement Study.

  4. Follow-on analyses employ root cause analysis and implementation research approaches to provide deeper understanding of issues uncovered by rapid-cycle analytics (e.g., explaining district-level performance variation, contextualizing the impact of health systems reforms, or investigating underlying causes of data quality issues and service delivery disruptions).

Illustrative capacity-building activities include support to automate the extraction, cleaning, and analysis of routine data and support to institutionalize rapid phone survey data collection and analysis approaches. Data use support prioritizes the integration of rapid-cycle analytics into existing data review and feedback mechanisms at national and subnational levels to strengthen the systematic use of data for decision making.


Figure 3. Rapid-cycle analytics under the Frequent Assessments and Health System Tools for Resilience (FASTR) initiative

How do countries use FASTR?

FASTR is designed to support country-defined policy questions using routine and survey data. Different countries enter FASTR through different pathways depending on their priorities. Some countries — such as Sierra Leone, Burkina Faso, Zambia, and Liberia — began with service continuity monitoring, triggered by changes in external financing for the health sector. Others — such as Nigeria, Ghana, and DRC — initiated FASTR to answer different priority questions, with disruption analysis as complementary work.

Regardless of the entry point, FASTR enables countries to monitor service continuity and recovery, identify geographic or service-specific challenges, and inform prioritization, planning, and policy dialogue.

From analysis to action

FASTR outputs — whether DQA scores, service use trends, or coverage estimates — are starting points, not endpoints. Each output triggers a cycle of investigation and action: findings raise questions, questions prompt investigation, investigation yields context, and context informs decisions. FASTR provides the evidence; stakeholders provide the context, judgment, and action.

Acronyms and abbreviations

Programs & organizations

  • DHS: The Demographic and Health Surveys (DHS) Program
  • FASTR: Frequent Assessments and Health System Tools for Resilience
  • GFF: Global Financing Facility
  • MICS: Multiple Indicator Cluster Surveys
  • MoH: Ministry of Health
  • UN: United Nations

Data systems & sources

  • DHIS2: DHIS2 (also spelled DHIS 2; formerly District Health Information Software)
  • HMIS: Health Management Information System
  • WPP: World Population Prospects

Health indicators & services

  • ANC: Antenatal Care
  • BCG: Bacillus Calmette-Guérin (tuberculosis vaccine)
  • MCV: Measles-Containing Vaccine
  • OPD: Outpatient Department
  • Penta: Pentavalent vaccine (diphtheria, tetanus, pertussis, hepatitis B, Haemophilus influenzae type b)
  • PHC: Primary Healthcare
  • RMNCAH-N: Reproductive, Maternal, Newborn, Child, and Adolescent Health and Nutrition
  • SBA: Skilled Birth Attendant

Technical & statistical terms

  • DQA: Data Quality Assessment
  • MAD: Median Absolute Deviation

Geographic

  • LMIC: Low- and Middle-Income Country
  • SSA: Sub-Saharan Africa

Use of routine health management information system data in LMICs

Health-facility data collected through routine HMIS constitute a primary data source for assessing the performance of the health sector. HMIS data are widely used for a variety of purposes, including health sector reviews, planning and resource allocation, program monitoring, health care quality improvement, and reporting. Ministries of Health in low- and middle-income countries (LMICs) are striving to ensure equitable access to quality health services and care in pursuit of universal health coverage and other national strategies. These efforts are more likely to succeed if decision-making at all levels of the sector is well informed by timely, reliable, and comprehensive data gathered through a well-established health information system. Sound decisions are based on sound data; therefore, it is essential to ensure that the data are of good quality.

Poor-quality data impact various levels of the health system in different ways. For program managers, inaccurate data can lead to poor decisions that harm the program's operations and, ultimately, the population's health. At the planning level, poor-quality data can distort evidence of progress towards health-sector goals and hinder annual planning processes by providing misleading results. Additionally, when determining investments in the health sector, poor-quality data can lead to poor targeting of resources. Despite HMIS data being crucial for robust health systems, studies in Sub-Saharan Africa (SSA) have reported challenges with data quality, including issues with completeness, timeliness, accuracy, and consistency (AbouZahr & Boerma, 2005; Amoakoh-Coleman et al., 2015; Belay & Lippeveld, 2013; Gimbel et al., 2011; Mavimbe et al., 2005; Moukénet et al., 2021; Mutale et al., 2013; Rowe, 2009; Sychareun et al., 2014; Teklegiorgis et al., 2016). These concerns about the quality of routine information have undermined its use in decision-making within the health sector (Belay & Lippeveld, 2013; Bhattacharya et al., 2019; Chen et al., 2014; Endriyas et al., 2019; Glèlè Ahanhanzo et al., 2014; Mutale et al., 2013; Mwangu, 2005; Nshimyiryo et al., 2020; O'Hagan et al., 2017; Ouedraogo et al., 2019; Rowe et al., 2009; Xiao et al., 2017). In recent years, however, countries have made substantial improvements in HMIS data quality, reinforced by a system of data quality assessment, data quality improvement, and data use for evidence-based decision making (Mphatswe et al., 2012; Nisingizwe et al., 2014; Wagenaar et al., 2015).

Defining data quality

Defining data quality is complex. While there is no single definition of data quality, four dimensions are most frequently used to describe it: completeness, timeliness, consistency, and accuracy (World Health Organization, 2017).

Data quality dimensions and assessment

Completeness
What does it measure? Are all data present? Is there sufficient information available to make decisions about the health of the population and to target resources to improve health-system coverage, efficiency, and quality?
How is it assessed?

  • Reporting completeness: whether all units that are supposed to report actually do
  • Indicator completeness: whether specific data elements are complete (no missing values); this differs from reporting completeness in that it looks at the completeness of individual indicators, not only the receipt of the monthly reporting form

Timeliness
What does it measure? Are data regularly submitted on time?
How is it assessed?

  • Timeliness: whether the units that submitted reports did so before a set deadline

Consistency
What does it measure? Are data plausible in view of what has been previously reported?
How is it assessed?

  • Presence of outliers: whether reported values are extreme relative to other values reported during the year or across several years
  • Consistency over time: whether trends in program indicators are extreme in relation to other values reported during the year or over several years
  • Consistency between related indicators: whether the expected relationship holds between two program indicators that have a predictable relationship
  • External comparison with other data sources: the level of agreement between two sources of data measuring the same health indicator; the two sources usually compared are data flowing through the HMIS and data from a periodic population-based survey
  • Consistency of population data: the adequacy of the population data used in evaluating the performance of health indicators, assessed by comparing two different sources of related population estimates for congruence

Accuracy
What does it measure? Do data faithfully reflect the actual level of service delivery conducted in the health facility?
How is it assessed?

  • Data verification factor: review of source documents in health facilities and comparison to monthly reports and HMIS values (consistency of reported data and original records)
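The dimensions above map directly onto facility-level monthly data. The sketch below computes reporting completeness, indicator completeness, timeliness, and MAD-based outlier flags from hypothetical facility records; the metric definitions and the modified z-score threshold are common choices, not necessarily FASTR's exact parameters.

```python
from statistics import median

# Hypothetical monthly ANC1 reports: one record per facility per month.
# None = value missing from the report; "on_time" = submitted before deadline.
reports = [
    {"facility": "F1", "month": "2023-01", "anc1": 40, "on_time": True},
    {"facility": "F1", "month": "2023-02", "anc1": 42, "on_time": True},
    {"facility": "F1", "month": "2023-03", "anc1": 400, "on_time": False},  # suspect value
    {"facility": "F2", "month": "2023-01", "anc1": 55, "on_time": True},
    {"facility": "F2", "month": "2023-02", "anc1": None, "on_time": True},  # missing value
    {"facility": "F2", "month": "2023-03", "anc1": 58, "on_time": False},
]

def reporting_completeness(reports, expected):
    """Share of expected facility-months for which any report was received."""
    return len(reports) / expected

def indicator_completeness(reports, field):
    """Share of received reports with a non-missing value for the indicator."""
    return sum(r[field] is not None for r in reports) / len(reports)

def timeliness(reports):
    """Share of received reports submitted before the deadline."""
    return sum(r["on_time"] for r in reports) / len(reports)

def mad_outliers(values, threshold=3.0):
    """Flag values whose modified z-score, based on the median absolute
    deviation (MAD), exceeds the threshold."""
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:
        return [False] * len(values)
    return [abs(0.6745 * (v - med) / mad) > threshold for v in values]
```

For the hypothetical data above, all six expected reports were received (reporting completeness 1.0), one of six has a missing ANC1 value, four of six arrived on time, and facility F1's March value of 400 is flagged as an outlier against its own history.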

FASTR approach to routine data analysis

The FASTR approach to routine data analysis takes a three-pronged approach:

  1. Identify issues in data quality
  2. Adjust for issues with data quality to improve analysis accuracy
  3. Analyze data to answer pressing country-specific policy questions including identifying changes in priority service volumes and trends in service coverage as compared to country priorities and targets

This approach enables identification of the highest-priority data quality issues and the analytical adjustments they require, so that data quality can be continually improved while appropriate analyses are conducted. Data quality assessment is conducted by indicator and can be disaggregated at the sub-national level, since facility-level data are used for the analysis. This makes it possible to generate regular, policy-relevant reporting on data quality, service volumes, and coverage estimates, providing a continual snapshot of RMNCAH-N service use.
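The three-pronged flow can be sketched as a simple pipeline. The helper functions below (median-based flagging, median imputation, and a first-to-last percent change) are illustrative stand-ins, not FASTR's actual quality checks or adjustment methods.

```python
from statistics import median

def identify_quality_issues(series):
    """Step 1 (illustrative): flag reported values more than 3x the
    series median as suspect outliers."""
    med = median(v for v in series if v is not None)
    return [v is not None and v > 3 * med for v in series]

def adjust_for_quality(series, flags):
    """Step 2 (illustrative): replace flagged or missing values with
    the median of the remaining trusted values."""
    trusted = [v for v, f in zip(series, flags) if v is not None and not f]
    fill = median(trusted)
    return [fill if (v is None or f) else v for v, f in zip(series, flags)]

def analyze_service_volumes(series):
    """Step 3 (illustrative): percent change from first to last month."""
    return 100 * (series[-1] - series[0]) / series[0]

def run_fastr_pipeline(raw_series):
    """Chain the three prongs: identify -> adjust -> analyze."""
    flags = identify_quality_issues(raw_series)
    adjusted = adjust_for_quality(raw_series, flags)
    return analyze_service_volumes(adjusted)

# A hypothetical monthly series with one missing and one suspect value:
monthly_anc1 = [100, 105, None, 900, 110]
change = run_fastr_pipeline(monthly_anc1)  # 10.0 (% increase over the period)
```

The point of the ordering is that the outlier (900) and the gap (None) are handled before the trend is computed, so the reported change reflects service use rather than data artifacts.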


Focus on a set of core indicators

The FASTR approach to routine data analysis focuses on a core set of RMNCAH-N indicators that characterize the reproductive, maternal, and child healthcare continuum, a priority area across LMICs. These indicators capture key service delivery events, which tend to have higher reporting completeness and higher volumes, and they serve as proxies for other services and interventions delivered at the same service contact. Outpatient consultations (OPD) are used as a proxy for the general use of health services. Additional country- and program-specific indicators can be added to the analysis to be responsive to country priorities.


Focus on a set of core data quality metrics

The FASTR approach to routine data analysis focuses on a core set of data quality metrics that enables identification of the highest-priority data quality issues for which data quality adjustments can be made. In addition to the core metrics, the FASTR approach generates an overall data quality score, which combines the core metrics into a single summary measure.
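A weighted mean is one straightforward way to combine core metrics into a single summary score. The weights and metric names below are purely illustrative assumptions, not FASTR's actual score composition.

```python
# Purely illustrative weights; FASTR's actual score composition may differ.
WEIGHTS = {"completeness": 0.4, "timeliness": 0.2, "outliers": 0.2, "consistency": 0.2}

def overall_quality_score(metrics, weights=WEIGHTS):
    """Combine core data quality metrics (each scored 0-100) into a
    single weighted summary score."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(metrics[name] * w for name, w in weights.items())

district_metrics = {"completeness": 95, "timeliness": 80, "outliers": 90, "consistency": 85}
score = overall_quality_score(district_metrics)  # 89.0
```

Because the score is computed per indicator and per geographic unit, it can be ranked across districts to target supervision and data quality improvement efforts.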

Results communication and data use

For example, during the COVID-19 pandemic, the GFF supported Ministries of Health in over 20 countries to monitor the pandemic's impact on essential health services.

More results and reports can be found in the FASTR Resource Repository.

What this documentation covers

This methodology documentation describes the complete FASTR approach to routine HMIS data analysis, from initial planning through results communication.

Planning & preparation

Analytics modules (FASTR platform)

The FASTR analytics platform includes four automated modules:

  • Data quality assessment - Module 1 in the platform. Assessment of HMIS data quality through completeness, outlier detection, and consistency metrics
  • Data quality adjustment - Module 2 in the platform. Techniques for improving data accuracy by adjusting for outliers and incomplete reporting
  • Service utilization analysis - Module 3 in the platform. Analysis of health service usage patterns to detect and quantify disruptions
  • Coverage estimates - Module 4 in the platform. Methods for estimating service coverage and comparing trends to country targets
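Module 2's adjustment for incomplete reporting can be illustrated with a commonly used scaling formula, adjusted = reported * (1 + (1/c - 1) * k), where c is reporting completeness and k is the assumed output of non-reporting facilities relative to reporters. Whether FASTR uses this exact form and default k is an assumption here.

```python
def adjust_for_completeness(reported, completeness, k=0.25):
    """Scale a reported service volume upward to account for facilities
    that did not report.  `completeness` is the reporting rate (0-1];
    `k` is the assumed service output of non-reporters relative to
    reporters (0 = they delivered nothing, 1 = same as reporters)."""
    return reported * (1 + (1 / completeness - 1) * k)

# 800 visits reported with 80% of facilities reporting, assuming
# non-reporters deliver a quarter of a reporter's volume:
adjusted = adjust_for_completeness(800, 0.8, k=0.25)  # 850.0
```

Note that when completeness is 1.0 the adjustment is a no-op, and larger k pushes the estimate further above the reported total; sensitivity analysis over k is a sensible companion to any such adjustment.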

Last updated: 24 February 2026. Contact: FASTR Project Team

References

  1. AbouZahr, C., & Boerma, T. (2005). Health information systems: The foundations of public health. Bulletin of the World Health Organization, 83(8), 578–583. 

  2. Mavimbe, J. C., Braa, J., & Bjune, G. (2005). Assessing immunization data quality from routine reports in Mozambique. BMC Public Health, 5, 108. https://doi.org/10.1186/1471-2458-5-108 

  3. Sychareun, V., Hansana, V., Phengsavanh, A., Chaleunvong, K., Eunyoung, K., & Durham, J. (2014). Data verification at health centers and district health offices in Xiengkhouang and Houaphanh Provinces, Lao PDR. BMC Health Services Research, 14, 255. https://doi.org/10.1186/1472-6963-14-255 

  4. Mutale, W. et al. (2013). Improving health information systems for decision making across five sub-Saharan African countries: Implementation strategies from the African Health Initiative. BMC Health Services Research, 13(2), S9. https://doi.org/10.1186/1472-6963-13-S2-S9 

  5. Amoakoh-Coleman, M. et al. (2015). Completeness and accuracy of data transfer of routine maternal health services data in the greater Accra region. BMC Research Notes, 8, 114. https://doi.org/10.1186/s13104-015-1058-3 

  6. Gimbel, S. et al. (2011). An assessment of routine primary care health information system data quality in Sofala Province, Mozambique. Population Health Metrics, 9, 12. https://doi.org/10.1186/1478-7954-9-12 

  7. Teklegiorgis, K., Tadesse, K., Terefe, W., & Mirutse, G. (2016). Level of data quality from Health Management Information Systems in a resources limited setting and its associated factors, eastern Ethiopia. South African Journal of Information Management, 18(1), 1–8. https://doi.org/10.4102/sajim.v18i1.612 

  8. Rowe, A. K. (2009). Potential of integrated continuous surveys and quality management to support monitoring, evaluation, and the scale-up of health interventions in developing countries. American Journal of Tropical Medicine and Hygiene, 80(6), 971. https://doi.org/10.4269/ajtmh.2009.80.971 

  9. Belay, H., & Lippeveld, T. (2013). Inventory of PRISM framework and tools: Application of PRISM tools and interventions for strengthening routine health information system performance (Working Paper Series WP-13-138). MEASURE Evaluation, Carolina Population Center. https://www.measureevaluation.org/resources/publications/wp-13-138/at_download/document 

  10. Moukénet, A. et al. (2021). Health management information system (HMIS) data quality and associated factors in Massaguet district, Chad. BMC Medical Informatics and Decision Making, 21(1), 326. https://doi.org/10.1186/s12911-021-01684-7 

  11. Xiao, Y. et al. (2017). Challenges in data quality: The influence of data quality assessments on data availability and completeness in a voluntary medical male circumcision programme in Zimbabwe. BMJ Open, 7(1), e013562. 

  12. O'Hagan, R. et al. (2017). National assessment of data quality and associated systems-level factors in Malawi. Global Health Science and Practice, 5(3), 367–381. https://doi.org/10.9745/GHSP-D-17-00177 

  13. Chen, H., Hailey, D., Wang, N., & Yu, P. (2014). A review of data quality assessment methods for public health information systems. International Journal of Environmental Research and Public Health, 11(5), 5170--5207. https://doi.org/10.3390/ijerph110505170 

  14. Glèlè Ahanhanzo, Y., Ouedraogo, L. T., Kpozèhouen, A., Coppieters, Y., Makoutodé, M., & Wilmet-Dramaix, M. (2014). Factors associated with data quality in the routine health information system of Benin. Archives of Public Health, 72(1), 25. https://doi.org/10.1186/2049-3258-72-25 

  15. Bhattacharya, A. A. et al. (2019). Quality of routine facility data for monitoring priority maternal and newborn indicators in DHIS2: A case study from Gombe State, Nigeria. PLoS ONE, 14(1), e0211265. https://doi.org/10.1371/journal.pone.0211265 

  16. Nshimyiryo, A. et al. (2020). Health management information system (HMIS) data verification: A case study in four districts in Rwanda. PLoS ONE, 15(7), e0235823. https://doi.org/10.1371/journal.pone.0235823 

  17. Ouedraogo, M. et al. (2019). A quality assessment of Health Management Information System (HMIS) data for maternal and child health in Jimma Zone, Ethiopia. PLoS ONE, 14(3), e0213600. https://doi.org/10.1371/journal.pone.0213600 

  18. Endriyas, M. et al. (2019). Understanding performance data: Health management information system data accuracy in Southern Nations Nationalities and People's Region, Ethiopia. BMC Health Services Research, 19, 1–6. https://doi.org/10.1186/s12913-019-3991-7 

  19. Mwangu, M. (2005). Quality of a routine data collection system for health: Case of Kinondoni district in the Dar es Salaam region, Tanzania. South African Journal of Information Management, 7(2). 

  20. Rowe, A. K., Kachur, S. P., Yoon, S. S., Lynch, M., Slutsker, L., & Steketee, R. W. (2009). Caution is required when using health facility-based data to evaluate the health impact of malaria control efforts in Africa. Malaria Journal, 8(1), 209. https://doi.org/10.1186/1475-2875-8-209 

  21. Nisingizwe, M. P. et al. (2014). Toward utilization of data for program management and evaluation: Quality assessment of five years of health management information system data in Rwanda. Global Health Action, 7(1), 25829. https://doi.org/10.3402/gha.v7.25829 

  22. Wagenaar, B. H., Sherr, K., Fernandes, Q., & Wagenaar, A. C. (2015). Using routine health information systems for well-designed health evaluations in low- and middle-income countries. Health Policy and Planning, czv029. https://doi.org/10.1093/heapol/czv029 

  23. Mphatswe, W. et al. (2012). Improving public health information: A data quality intervention in KwaZulu-Natal, South Africa. Bulletin of the World Health Organization, 90(3), 176–182. https://doi.org/10.2471/blt.11.092759 

  24. World Health Organization. (2017). Data quality assurance: A toolkit for facility data quality assessment: Module 1: Framework and metrics. World Health Organization. https://iris.who.int/bitstream/handle/10665/366086/9789240047358-eng.pdf?sequence=1