Clinical Question: Is reticulocyte hemoglobin content the preferred screening tool for the early detection of iron deficiency in infants?
Setting: Outpatient (any)
Study Design: Cohort (prospective)
Synopsis: Iron deficiency in infants is linked with impaired mental and motor development that may be long lasting. Relying on hemoglobin measurement alone misses children with early iron deficiency who are not yet anemic. The authors of this study prospectively followed 202 healthy infants nine to 12 months of age to compare reticulocyte hemoglobin content with hemoglobin as a screen for iron deficiency. Hemoglobin is derived from the entire population of red blood cells, each with a lifespan of approximately 120 days, whereas the reticulocyte hemoglobin measurement reflects only the 24- to 48-hour lifespan of the reticulocyte. Eighty-four percent of the infants were followed for a median of 5.6 months. The authors did not specify whether results were interpreted blindly, but because the same automated hematology analyzer performed all tests, they likely were. A total of 23 infants (11.4 percent) had iron deficiency (serum transferrin saturation less than 10 percent), and six (3 percent) had both iron deficiency and anemia (hemoglobin less than 11 g per dL [110 g per L]). The optimal cutoff for detecting iron deficiency with reticulocyte hemoglobin content was 27.5 pg (sensitivity = 83 percent; specificity = 72 percent). Screening with a hemoglobin threshold of less than 11 g per dL yielded a sensitivity of 26 percent and a specificity of 95 percent.
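The sensitivity and specificity figures above follow directly from the 2x2 counts at each cutoff. As a minimal sketch, the true-positive and true-negative counts below are assumptions chosen to be consistent with the study's reported totals (23 iron-deficient infants out of 202) and percentages; they are not the study's actual cell counts.

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Proportion of truly iron-deficient infants flagged by the test."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Proportion of non-deficient infants correctly left unflagged."""
    return true_neg / (true_neg + false_pos)

# Assumed illustrative counts: of the 23 iron-deficient infants, a cutoff
# flagging 19 gives sensitivity 19/23, about 83 percent; of the 179
# non-deficient infants, 129 correctly unflagged gives specificity
# 129/179, about 72 percent, matching the reported figures.
print(round(sensitivity(19, 4), 2))    # 0.83
print(round(specificity(129, 50), 2))  # 0.72
```

This also shows why hemoglobin screening alone performs poorly here: a test with 26 percent sensitivity would flag only about 6 of the 23 iron-deficient infants, consistent with the six who were both iron deficient and anemic.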
Bottom Line: A low reticulocyte hemoglobin content has higher sensitivity for detecting early iron deficiency in infants than a standard hemoglobin measurement. Randomized trials comparing infants screened with either technique, or not screened at all, are now needed to assess the long-term value of screening. (Level of Evidence: 2b)