Human immunodeficiency virus (HIV) infection and AIDS are responsible for substantial morbidity and mortality in the United States. In 2011, there were nearly 50,000 HIV infection diagnoses, at a rate of 15.8 per 100,000 persons.1 In 2013, the U.S. Preventive Services Task Force (USPSTF) strongly recommended universal screening for HIV infection in all pregnant women, and in adolescents and adults 15 to 65 years of age.2 Screening for HIV infection meets many of the criteria for a worthwhile screening test: early detection can reduce morbidity and mortality, the test is highly sensitive and specific, and the testing method is acceptable to most patients.
Although the American Academy of Family Physicians (AAFP) agrees with much of this recommendation, it recommends delaying the start of routine screening until 18 years of age, and screening all patients 18 to 65 years of age. The AAFP notes that screening younger patients may still be beneficial based on high-risk behaviors.3 The AAFP position differs from that of the USPSTF for the following reasons: the rate of HIV infection diagnosis is substantially lower in those 15 to 17 years of age; higher-risk adolescents can still be tested based on risk factors and regional prevalence; the positive predictive value of a test in a low-prevalence population is poor; a low-yield test would lead to potentially unnecessary, time-consuming interventions in the practice of a busy family physician; and the benefit of diagnosis and treatment in persons younger than 18 years is uncertain.
A screening test is less useful in a population with a low prevalence of the disease. Data from the Centers for Disease Control and Prevention from 2010 indicate an estimated annual HIV infection diagnosis rate of 1.1 per 100,000 15-year-olds, 3.3 per 100,000 16-year-olds, and 8.0 per 100,000 17-year-olds (Unpublished data, e-mail communication with the HIV Incidence and Case Surveillance Branch of the Centers for Disease Control and Prevention, December 20, 2012).
Although current HIV screening tests are highly sensitive and specific, the low diagnosis rate in adolescents 15 to 17 years of age has a significant impact on the utility of the test. The USPSTF describes the sensitivity and specificity of the two-step screening process (a repeatedly reactive enzyme immunoassay followed by a confirmatory Western blot or immunofluorescent assay) as greater than 99.5%,2 whereas other sources have cited up to 99.7%.4 Using these estimates, the positive predictive value of HIV testing in 15- to 17-year-olds is 2.6%. In other words, 97.4% of positive test results are false positives, or 37 false-positive results for every true-positive result. In contrast, in persons 20 to 24 years of age, the annual HIV infection diagnosis rate is 36.4 per 100,000,1 and the positive predictive value of HIV testing is 11%, or 8 false-positive results for every true-positive result. These numbers do not account for regional variations,1,5 which can result in an even lower positive predictive value in areas with lower diagnosis rates, such as the Midwest and northwestern United States.
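The arithmetic behind these positive predictive values can be reproduced with Bayes' theorem. The sketch below assumes a test sensitivity and specificity of 99.7% (the upper estimate cited above) and treats the annual diagnosis rate as the pretest probability; under those assumptions, the 17-year-old rate of 8 per 100,000 yields the 2.6% figure, and the 20- to 24-year-old rate yields roughly 11%. The exact inputs behind the published figures are not stated, so this is an illustrative reconstruction, not the source calculation.

```python
def ppv(rate_per_100k, sensitivity=0.997, specificity=0.997):
    """Positive predictive value via Bayes' theorem.

    rate_per_100k is used as the pretest probability of infection.
    The 99.7% sensitivity/specificity defaults are the upper
    estimate cited in the text (an assumption for illustration).
    """
    p = rate_per_100k / 100_000
    true_pos = sensitivity * p              # infected and test-positive
    false_pos = (1 - specificity) * (1 - p) # uninfected but test-positive
    return true_pos / (true_pos + false_pos)

# 17-year-olds: 8 diagnoses per 100,000
print(round(ppv(8.0) * 100, 1))   # prints 2.6
# 20- to 24-year-olds: 36.4 diagnoses per 100,000
print(round(ppv(36.4) * 100, 1))  # prints 10.8, i.e., roughly 11%
```

Note how strongly the result is driven by pretest probability: the same highly specific test moves from about 1 true positive per 38 positive results in the younger group to about 1 in 9 in the older group.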
False-positive test results can lead to significant emotional distress for teenagers and their families before confirmatory tests can be performed. This distress could affect the physician-patient relationship and the patient's willingness to undergo future screening. Yield can be improved by risk factor–based screening. Family physicians should, and typically do, address sexuality with adolescents.6 Although studies on the accuracy of asking teenagers about their sexual behavior are limited, risk factor–based screening identifies 75% to 80% of persons with infection.2
Physicians should counsel patients appropriately when screening for HIV infection. Patient-centered pretest counseling is time consuming and often poorly understood by patients and physicians.7 Physicians must explain the screening rationale, describe the significance of a positive result, and contact the patient with results. In a health care system with an increasing reliance on primary care physicians and an inadequate number of physicians to meet the demand,8 time is already a limited resource.9 Additionally, the benefit of detecting HIV infection in a 15- to 17-year-old patient vs. detecting the infection in the same adolescent at 18 years of age is unknown.
There are approximately 12 million 15- to 17-year-olds in the United States.10 The effort and cost to universally screen for HIV infection in this low-risk population are substantial, with minimal demonstrated benefit. Family physicians should screen all high-risk individuals for HIV infection, including adolescents engaging in high-risk behavior. Screening all patients 18 to 65 years of age at least once is also more likely to be beneficial than harmful. However, the AAFP believes that screening patients 15 to 17 years of age in the general population offers little benefit with substantial additional effort and cost, and should not be prioritized.
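To make the scale of that effort concrete, the figures above can be combined into a rough estimate. The sketch below assumes a 99.7% specificity and applies the CDC 2010 age-specific diagnosis rates cited earlier to the 12 million-person cohort; these are illustrative assumptions, not published projections.

```python
COHORT = 12_000_000  # approximate number of 15- to 17-year-olds in the U.S.
SPECIFICITY = 0.997  # upper estimate cited in the text (assumption)
RATES_PER_100K = {15: 1.1, 16: 3.3, 17: 8.0}  # CDC 2010 diagnosis rates

# Simple average across the three ages, converted to a proportion.
avg_rate = sum(RATES_PER_100K.values()) / len(RATES_PER_100K) / 100_000

expected_true_positives = COHORT * avg_rate
# Nearly everyone screened is uninfected, so false positives
# track the false-positive rate (1 - specificity).
expected_false_positives = COHORT * (1 - avg_rate) * (1 - SPECIFICITY)

print(round(expected_true_positives))   # ~500 true diagnoses per year
print(round(expected_false_positives))  # ~36,000 false-positive results
```

Under these assumptions, universal screening of the cohort would be expected to surface on the order of 500 true diagnoses per year against roughly 36,000 false-positive results, each requiring confirmatory testing and counseling.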
EDITOR'S NOTE: Dr. Brown was the 2013 chair of the AAFP's Subcommittee on Clinical Preventive Services and is the 2014 chair of the Commission on Health of the Public and Science.