It's 2030. Your next patient is Mrs. Jones, a 60-year-old with heart failure. A computer program that simulates human conversation, or a chatbot, scheduled the visit after asking the patient questions and noting that weight measurements from her internet-connected scale had been increasing. She enrolled in a heart failure management program when an artificial intelligence (AI) prediction tool identified her as being at high risk of an exacerbation this year. In addition to the smart scale, she is supported by a social worker, pharmacist, and dietitian, who ensure that she has everything she needs to manage her heart failure.
You walk into the room and hug her with both arms because you are not bringing your laptop to the encounter. You talk about her son, who recently died from cancer. During the encounter, an AI program that analyzes facial expressions by using video images recommends that you screen Mrs. Jones for depression. When answering the questions, she cries, recalling the sadness that has accompanied her son's death. You do not feel rushed because a program powered by AI is writing your notes and sending prescriptions for the medications you adjust.
Although this sounds like science fiction, the technology described exists and has the potential to enhance patient-physician relationships. Over the past decade, AI (i.e., technologies that perform tasks that normally require human intelligence) has been integrated into clinical decision support systems to provide timely information at the point of care and inform medical decision-making.1,2 Today, clinical decision support systems help family physicians document encounters, standardize orders, and identify deadly drug interactions. AI broadens the range of these systems and is already being used to help family physicians.3 For example, a digital assistant uses AI-powered voice recognition to generate notes, reducing the time spent on documentation by 62% (from 13.5 to 5 minutes).4 Chatbots compile symptoms to triage patients and can ensure that patients access primary care in a timely manner. AI interprets smartphone images to assist with the diagnosis of skin lesions and can reduce unnecessary referrals.5
Physicians are turning to AI because the computerization of health care has led to an avalanche of data and rising rates of burnout. With primary care physicians spending more time on documentation in electronic health records than on face time with patients,6 there is a disconnect between the healers we want to be and the data managers we have become. As AI performs tasks amenable to automation, the hope is that family physicians can focus on the responsibilities that cannot be easily replicated, such as building relationships, weighing preferences, and managing complexity.7
Although AI has the potential to be the solution that primary care needs to reclaim relationships, it could just as easily make things worse by leading to endless alerts, nonsensical notes, misdiagnoses, and data breaches. Critics argue that AI is already worsening disparities and magnifying biases. For instance, one study found that an algorithm trained on insurance claims was biased against Black patients.8 Despite having more comorbidities than White patients, Black patients were less likely to be referred to a care management program. Black patients have historically received fewer services because of institutional racism, so the algorithm predicted that their care would be less costly in the future and consequently referred them at a lower rate. Furthermore, because health care data can be incomplete or inaccurate, AI can make mistakes and cause harm.9 Like a physician acting on intuition, some programs cannot explain the logic behind their decisions, raising concerns that errors will be repeated. All these issues can lead to worse health and higher costs.
To avoid these consequences, family physicians should participate in the AI revolution. We need to partner with researchers to tailor AI to primary care. We need to serve on health information technology committees within our hospitals and practices, or at least provide feedback to these groups, and we need to open our clinics to trials to ensure that AI works in primary care settings. As family physicians, we do not need to be AI experts to participate; instead, it is our primary care expertise that is needed. For example, the American Academy of Family Physicians is recruiting practices to test an AI digital assistant.10
It is also critical to connect with computer scientists and tell them about the problems that frontline family physicians face. The American Board of Family Medicine and Western University independently hosted meetings that brought together experts in AI and primary care to develop a research agenda and prioritize questions within this burgeoning field.11,12 The websites of professional primary care medical societies, including the American Academy of Family Physicians, American Board of Family Medicine, and the North American Primary Care Research Group, highlight opportunities for us to get involved and make our voices heard.11,13,14 Without our passion, insights, and clinical experiences, AI may follow the path of electronic health records and widen the divide between patients and physicians. Instead of being the antidote, AI could leave us feeling depleted. Which path it takes depends on what we do next.