It is increasingly apparent that access to healthcare without adequate quality of care is insufficient to improve population health outcomes. We assess whether the most commonly measured attribute of health facilities in low- and middle-income countries (LMICs)—the structural inputs to care—predicts the clinical quality of care provided to patients.

Methods and findings

Service Provision Assessments are nationally representative health facility surveys conducted by the Demographic and Health Surveys Program with support from the US Agency for International Development. These surveys assess health system capacity in LMICs. We drew data from assessments conducted in 8 countries between 2007 and 2015: Haiti, Kenya, Malawi, Namibia, Rwanda, Senegal, Tanzania, and Uganda. The surveys included an audit of facility infrastructure and direct observation of family planning, antenatal care (ANC), sick-child care, and (in 2 countries) labor and delivery. To measure structural inputs, we constructed indices of World Health Organization-recommended amenities, equipment, and medications for each service. To measure clinical quality, we used data from direct observations of care to calculate providers’ adherence to evidence-based care guidelines. We assessed the correlation between these metrics and used spline models to test for a minimum input threshold associated with good clinical quality. Inclusion criteria were met by 32,531 observations of care in 4,354 facilities. Facilities demonstrated moderate levels of infrastructure, with mean scores ranging from 0.63 of 1 for sick-child care to 0.75 of 1 for family planning. Adherence to evidence-based guidelines was low, averaging 37% in sick-child care, 46% in family planning, 60% in labor and delivery, and 61% in ANC. Correlation between infrastructure and evidence-based care was weak (median 0.20, ranging from −0.03 for family planning in Senegal to 0.40 for ANC in Tanzania). Facilities with similar infrastructure scores delivered care of widely varying quality in each service. We did not detect a minimum level of infrastructure that was reliably associated with higher quality of care in any service.
These findings rely on cross-sectional data, preventing assessment of relationships between structural inputs and clinical quality over time; measurement error may attenuate the estimated associations.
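The correlation and spline-threshold analysis described above can be illustrated in outline. The sketch below uses simulated facility-level data and a simple one-knot linear spline chosen by grid search; the data, knot grid, and model specification are hypothetical and are not the study's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical facility-level data: structural input index in [0, 1]
# and adherence to guidelines in [0, 1], weakly correlated.
input_index = rng.uniform(0.3, 1.0, n)
adherence = np.clip(0.2 * input_index + rng.normal(0.35, 0.15, n), 0.0, 1.0)

# Pearson correlation between structural inputs and clinical quality.
r = np.corrcoef(input_index, adherence)[0, 1]

def spline_sse(k):
    """Residual sum of squares for a linear spline with one knot at k."""
    X = np.column_stack([
        np.ones(n),                          # intercept
        input_index,                         # slope below the knot
        np.maximum(input_index - k, 0.0),    # slope change above the knot
    ])
    beta, *_ = np.linalg.lstsq(X, adherence, rcond=None)
    return float(np.sum((adherence - X @ beta) ** 2))

# Grid-search the candidate threshold that best fits the data; a clear
# minimum with a steep upper segment would suggest an input threshold.
knots = np.linspace(0.4, 0.9, 26)
best_knot = knots[np.argmin([spline_sse(k) for k in knots])]
```

In the study's setting, the absence of a reliably detected threshold corresponds to no knot at which fit improves markedly and the association strengthens.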

Conclusions

Inputs to care are poorly correlated with the provision of evidence-based care in these 4 clinical services. Healthcare workers in well-equipped facilities often provided poor care, and vice versa. While strong infrastructure is important, it should not be used as a proxy for the quality of care delivered. Insight into health system quality requires measurement of the processes and outcomes of care.

Author summary

Why was this study done?

  • Improved quality of care is increasingly recognized as a necessary step towards achievement of better population health outcomes in low- and middle-income countries.
  • Much of the current measurement effort focuses on inputs to care.
  • It is not known whether such measures provide insight into the quality of care actually delivered.

What did the researchers do and find?

  • We quantified facility infrastructure using international guidelines for readiness in each service applied to health facility audits in 8 countries; we defined quality of clinical care based on adherence to evidence-based protocols measured using direct observation in the same facility assessments.
  • We calculated the level and correlation of infrastructure and average adherence to guidelines for each of 4 clinical services in this sample: family planning (1,842 facilities), antenatal care (1,407 facilities), delivery care (227 facilities), and sick-child care (4,038 facilities).
  • Infrastructure scored higher than observed clinical quality in each service, and the correlation between the 2 was modest.
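The two scores compared above can be sketched as checklist shares: an input index as the fraction of recommended items present in a facility audit, and clinical quality as the fraction of guideline-recommended actions observed during a visit. The item lists below are hypothetical illustrations, not the study's actual checklists.

```python
# Hypothetical checklists: binary indicators, 1 = item present / action done.
def index_score(checklist: dict) -> float:
    """Share of checklist items satisfied, in [0, 1]."""
    return sum(checklist.values()) / len(checklist)

facility_audit = {"water": 1, "electricity": 1, "scale": 0, "iron_tablets": 1}
anc_observation = {"measured_bp": 1, "took_history": 0, "counseled_danger_signs": 0}

input_index = index_score(facility_audit)       # 0.75
adherence_score = index_score(anc_observation)  # ~0.33
```

A facility can score high on the first checklist and low on the second, which is the dissociation the study reports.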

What do these findings mean?

  • Assessment of infrastructure is insufficient to estimate the quality of care delivered to women and children in need.
  • Measurement priorities should be reassessed to support more timely information for quality improvement purposes and more pertinent information on the quality of care delivered for monitoring and comparison.