BACKGROUND: High school students 16 to 18 years old contribute 10% of the US blood supply. Mitigating iron depletion in these donors is important because they are still undergoing physical and neurocognitive development.
STUDY DESIGN AND METHODS: Study objectives were to determine the prevalence of iron depletion in 16- to 18-year-old donors and whether their risk for iron depletion was greater than that of adult donors. Successful, age-eligible donors were enrolled from high school blood drives at two large US blood centers. Plasma ferritin testing was performed, with ferritin less than 12 ng/mL as the primary measure of iron depletion and ferritin less than 26 ng/mL as a secondary measure. Multivariable repeated-measures logistic regression models evaluated the role of age and other demographic and donation factors.
RESULTS: Ferritin was measured from 4265 enrollment donations from September to November 2015 and from 1954 follow-up donations through May 2016. At enrollment, the prevalence of ferritin less than 12 ng/mL in teenagers was 1% in males and 18% in females making their first blood donation, and 8% in males and 33% in females with prior donations. Adjusted odds of ferritin less than 12 ng/mL were 2.1- to 2.8-fold higher in 16- to 18-year-olds than in 19- to 49-year-olds, and adjusted odds of ferritin less than 26 ng/mL were 3.3- to 4.7-fold higher in 16- to 18-year-olds. Progression to hemoglobin deferral was twice as likely in 16- to 18-year-old versus 19- to 49-year-old females.
CONCLUSION: Age 16 to 18 years is an independent risk factor for iron depletion in blood donors at any donation frequency. Blood centers should implement alternative eligibility criteria or additional safety measures to protect teenage donors from iron depletion.