JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY  VOL. 71, NO. 21, 2018
© 2018 BY THE AMERICAN COLLEGE OF CARDIOLOGY FOUNDATION
PUBLISHED BY ELSEVIER
ORIGINAL INVESTIGATIONS
Smartwatch Algorithm for Automated Detection of Atrial Fibrillation
Joseph M. Bumgarner, MD,a Cameron T. Lambert, MD,a Ayman A. Hussein, MD,a Daniel J. Cantillon, MD,a Bryan Baranowski, MD,a Kathy Wolski, MPH,b Bruce D. Lindsay, MD,a Oussama M. Wazni, MD, MBA,a Khaldoun G. Tarakji, MD, MPHa
ABSTRACT
BACKGROUND The Kardia Band (KB) is a novel technology that enables patients to record a rhythm strip using an
Apple Watch (Apple, Cupertino, California). The band is paired with an app providing automated detection of atrial
fibrillation (AF).
OBJECTIVES The purpose of this study was to examine whether the KB could accurately differentiate sinus rhythm (SR)
from AF compared with physician-interpreted 12-lead electrocardiograms (ECGs) and KB recordings.
METHODS Consecutive patients with AF presenting for cardioversion (CV) were enrolled. Patients underwent pre-CV ECG along with a KB recording. If CV was performed, a post-CV ECG was obtained along with a KB recording. The KB interpretations were compared to physician-reviewed ECGs. The KB recordings were reviewed by blinded electrophysiologists and compared to ECG interpretations. Sensitivity, specificity, and k coefficient were measured.
RESULTS A total of 100 patients were enrolled (age 68 ± 11 years). Eight patients did not undergo CV as they were found to be in SR. There were 169 simultaneous ECG and KB recordings. Fifty-seven were noninterpretable by the KB. Compared with ECG, the KB interpreted AF with 93% sensitivity, 84% specificity, and a k coefficient of 0.77. Physician interpretation of KB recordings demonstrated 99% sensitivity, 83% specificity, and a k coefficient of 0.83. Of the 57 noninterpretable KB recordings, interpreting electrophysiologists diagnosed AF with 100% sensitivity, 80% specificity, and a k coefficient of 0.74. Among 113 cases where KB and physician readings of the same recording were interpretable, agreement was excellent (k coefficient = 0.88).
CONCLUSIONS The KB algorithm for AF detection supported by physician review can accurately differentiate AF
from SR. This technology can help screen patients prior to elective CV and avoid unnecessary procedures.
(J Am Coll Cardiol 2018;71:2381–8) © 2018 by the American College of Cardiology Foundation.
ISSN 0735-1097/$36.00  https://doi.org/10.1016/j.jacc.2018.03.003

From the aDepartment of Cardiovascular Medicine, Cleveland Clinic, Cleveland, Ohio; and the bCleveland Clinic Coordinating Center for Clinical Research (C5Research), Cleveland Clinic, Cleveland, Ohio. AliveCor provided the Kardia Band monitors that were connected to an Apple Watch and paired via Bluetooth to a smartphone device for utilization in the study. AliveCor was not involved in the design, implementation, data analysis, or manuscript preparation of the study. Dr. Hussein has served as a consultant for Abbott and Biosense Webster. Dr. Cantillon has served as a consultant for Abbott, Boston Scientific, Stryker Sustainability, and LifeWatch. Dr. Wazni has received a speaker honorarium from Spectranetics. Dr. Tarakji has served on the medical advisory board of Medtronic and AliveCor. All other authors have reported that they have no relationships relevant to the contents of this paper to disclose.

Manuscript received February 14, 2018; revised manuscript received March 1, 2018; accepted March 2, 2018.

ABBREVIATIONS AND ACRONYMS

AF = atrial fibrillation
CV = cardioversion
ECG = electrocardiogram
ILR = implantable loop recorder
KB = Kardia Band
SR = sinus rhythm

Atrial fibrillation (AF) is the most commonly encountered arrhythmia in clinical practice, and population-based studies forecast more than 6 million individuals living with this diagnosis by 2050 (1,2). It is a chronic condition whose prevalence increases with age, and it represents a growing economic burden for our health care system (3,4). Although the journey of AF begins with an initial diagnosis, its management is long term, nuanced, and often involves hospital-based interventions along the way, including electrical cardioversion (CV).
Recently, commercially available handheld cardiac rhythm recorders have been developed that can record a rhythm strip using smartphone technology (5). In November 2017, the Kardia Band (KB) (AliveCor, Mountain View, California) was introduced as the first U.S. Food and Drug Administration (FDA)–cleared Apple Watch accessory (Apple, Cupertino, California) that allows a patient to record a rhythm strip equivalent to lead I for 30 s. The KB is coupled with an application that provides an instantaneous and automatic rhythm adjudication algorithm for the diagnosis of AF. The application can inform the patient when AF is detected and transmit these results to the patient's caring physician instantaneously.
SEE PAGE 2389
The primary objective of our study was to examine whether the KB and AF detection algorithm could accurately and reliably differentiate sinus rhythm (SR) from AF when compared with physician-interpreted 12-lead ECGs and KB recordings in patients with known AF presenting to a high-volume hospital-based electrophysiology practice for scheduled electrical CV.
METHODS
STUDY DESIGN. This was a prospective, nonrandomized, adjudicator-blinded study completed at a tertiary care hospital-based electrical CV laboratory, designed to evaluate the accuracy of the KB automated algorithm for the detection of AF. AliveCor provided the KB connected to an Apple Watch, which was paired via Bluetooth to a smartphone device (Apple) for utilization in the study (Figure 1). The Cleveland Clinic's Institutional Review Board approved the study.
STUDY PARTICIPANTS. Consecutive patients with a diagnosis of AF who presented for scheduled elective CV with or without a planned transesophageal echocardiogram were screened for enrollment. Inclusion criteria included all adult patients age 18 to 90 years who were able to provide informed consent and willing to wear the KB before and after CV. We excluded all patients with an implanted pacemaker or defibrillator.
Once enrolled, patients underwent a pre-CV ECG followed immediately by a KB recording. These paired recordings were considered simultaneous. If the CV was performed, a post-CV ECG was then obtained along with another KB recording. The KB tracing was automatically analyzed using the KB algorithm. This algorithm measures rhythm irregularity and P-wave absence in real time to classify the rhythm strip as "possible AF." If the criteria for AF are not met, the KB algorithm classifies regular rhythms with P waves as "normal" if the rate is between 50 and 100 beats/min, or "unclassified" for rhythms with rates <50 or >100 beats/min or if the recording is noisy or shorter than 30 s. The KB rhythm strips were automatically transferred to the secure AliveCor server, downloaded, and printed for review.
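The decision rules described above can be sketched as follows. This is a hypothetical illustration of the published description, not AliveCor's proprietary implementation: the function name, its inputs, and the ordering of the checks are assumptions, while the thresholds (30-s recording length, 50 to 100 beats/min) come from the text.

```python
# Hypothetical sketch of the KB classification rules as described in the text.
# The real AliveCor algorithm is proprietary; inputs and check ordering here
# are assumptions, while the thresholds are taken from the paper.

def classify_strip(duration_s, noisy, rate_bpm, irregular, p_waves_present):
    # Noisy or short recordings cannot be adjudicated
    if noisy or duration_s < 30:
        return "unclassified"
    # Rhythm irregularity plus P-wave absence -> possible AF
    if irregular and not p_waves_present:
        return "possible AF"
    # Regular rhythm with P waves at 50-100 beats/min -> normal
    if p_waves_present and not irregular and 50 <= rate_bpm <= 100:
        return "normal"
    # Everything else (e.g., regular rhythms at rates <50 or >100 beats/min)
    return "unclassified"
```

Under these rules, a tachycardic but regular rhythm is deliberately left "unclassified" rather than called normal, which is consistent with the unclassified-recording counts reported in the Results.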
All automated KB rhythm strips and ECGs were anonymized and distributed to 2 blinded electrophysiologists (BB and DC), who independently interpreted each tracing and assigned a diagnosis of SR, AF or atrial flutter, or unclassified. If the 2 electrophysiologists disagreed on the diagnosis, a third electrophysiologist (AH) reviewed the tracing and assigned a final diagnosis. To assess the accuracy of the KB algorithm at appropriately identifying AF, the automated KB interpretations were compared with both the physician-interpreted KB rhythm strips and the physician-reviewed simultaneous ECGs.
STATISTICAL ANALYSIS. Sensitivity and specificity were calculated for the KB automated interpretation compared with the physician-interpreted 12-lead ECG, for the physician-interpreted KB rhythm strip compared with the physician-interpreted 12-lead ECG, and for the KB automated interpretation compared with physician-interpreted KB recordings. Kappa (k) coefficients for interobserver agreement were assessed; k coefficients >0.80 were considered to represent excellent agreement. AF and atrial flutter were considered a single disease state for all interpretations.
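As a minimal sketch of the metrics just described (not the authors' analysis code), sensitivity, specificity, and Cohen's kappa can all be computed from a 2×2 contingency table of paired readings. The example counts below are taken from Table 2 (KB algorithm vs. physician-read 12-lead ECG, interpretable recordings only); the function itself is illustrative.

```python
# Illustrative computation of the accuracy metrics used in the paper,
# from a 2x2 contingency table of paired readings. Not the authors' code.

def accuracy_metrics(tp, fn, fp, tn):
    """Return (sensitivity, specificity, Cohen's kappa) for a 2x2 table.

    tp: both readers call AF; fn: index test calls SR, reference calls AF;
    fp: index test calls AF, reference calls SR; tn: both call SR.
    """
    n = tp + fn + fp + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    p_observed = (tp + tn) / n
    # Chance agreement expected from the marginal totals
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return sensitivity, specificity, kappa

# Table 2 counts: KB called AF in 63 true-AF and 7 SR recordings;
# KB called SR in 5 true-AF and 37 SR recordings (112 interpretable pairs).
sens, spec, kappa = accuracy_metrics(tp=63, fn=5, fp=7, tn=37)
```

With the Table 2 counts, this reproduces the reported 93% sensitivity (63 of 68), 84% specificity (37 of 44), and k coefficient of 0.77.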
RESULTS
A total of 100 patients were enrolled in the study from March 2017 through June 2017. Demographics and clinical characteristics are summarized in Table 1. CV was performed in 85% of study participants. Of the 15 patients who did not undergo CV, 8 were cancelled due to presentation in SR. There were 169 simultaneous 12-lead ECG and KB recordings obtained from study participants, and 57 KB recordings were determined to be unclassified by the KB algorithm. Of the 57 unclassified KB tracings, 16 (28%) were due to baseline artifact and low amplitude of the recording, 12 (21%) were due to a recording of <30 s in duration, 6 (10%) were due to a heart rate of <50 beats/min,
TABLE 1 Demographics and Clinical and Procedural Characteristics of Enrolled Patients (N = 100)

Age, yrs                       68.2 ± 10.86
Female                         17 (17.0)
Anticoagulant
  Warfarin                     32 (32.0)
  Dabigatran                   2 (2.0)
  Rivaroxaban                  19 (19.0)
  Apixaban                     47 (47.0)
TEE performed
  Yes, scheduled               21 (21.0)
  Yes, added on                2 (2.0)
TEE finding
  No thrombus                  20 (20.2)
  Sludge                       1 (1.0)
  Thrombus                     2 (2.0)
CV performed
  Yes                          85 (85.0)
  No                           15 (15.0)
Reason if no CV performed
  Subtherapeutic INR           4 (26.7)
  Found to be in NSR           8 (53.3)
  Thrombus on TEE              2 (13.3)
  Hypotension during TEE       1 (6.7)
CV outcome
  Successful                   78 (91.7)
  Transient                    3 (3.5)
  Failed                       4 (4.7)

Values are mean ± SD or n (%).
CV = cardioversion; INR = international normalized ratio; NSR = normal sinus rhythm; TEE = transesophageal echocardiogram.
TABLE 2 KB Algorithm Reading Compared to Electrophysiologist-Interpreted 12-Lead ECG

                              Electrophysiologist-Interpreted 12-Lead ECG
KB Algorithm Interpretation   AF/Flutter   SR   Noninterpretable   Total
AF/flutter                        63        7          0             70
SR                                 5       37          0             42
Missing/unclassified              23       34          0             57
Total                             91       78          0            169

Sensitivity, specificity, and k coefficient are calculated only for the simultaneous transmissions with interpretation (in bold): sensitivity of 93% (63 of 68; 95% confidence interval: 86% to 99%), specificity of 84% (37 of 44; 95% confidence interval: 73% to 95%), and k coefficient of 0.77 (95% confidence interval: 0.65 to 0.89).
AF = atrial fibrillation; ECG = electrocardiogram; KB = Kardia Band; SR = sinus rhythm.
FIGURE 1 The Kardia Band From AliveCor Paired With an Apple Smartwatch

AliveCor (Mountain View, California) Kardia Band monitors were connected to an Apple smartwatch and paired via Bluetooth to a smartphone device for utilization in the study.
5 (9%) were due to a heart rate of >100 beats/min, and the remaining 18 (32%) were unclassified for an unclear reason. Electrophysiologist-interpreted 12-lead ECGs were all interpretable.
To test the ability of the KB algorithm to detect AF, automated KB rhythm interpretations and electrophysiologist-interpreted 12-lead ECGs were compared. Among the recordings where the KB provided a diagnosis, it correctly diagnosed AF with 93% sensitivity, 84% specificity, and a k coefficient of 0.77 (95% confidence interval: 0.65 to 0.89) when compared with the electrophysiologist-interpreted 12-lead ECG (Table 2). Because our analysis used multiple observations from the same individual, we evaluated for possible intraindividual correlations by comparing only pre-CV KB recordings to electrophysiologist-interpreted 12-lead ECGs and found the performance of the KB algorithm to be unchanged (Online Table 1).
To determine whether the automated KB recordings labeled as "unclassified" by the algorithm were still clinically useful, these tracings were interpreted by our blinded electrophysiologists and compared with the electrophysiologist-interpreted 12-lead ECGs. Of the 57 automated unclassified KB recordings, the interpreting electrophysiologists were able to correctly diagnose AF with 100% sensitivity, 80% specificity, and a k coefficient of 0.74 (Table 3).
To assess the fidelity and overall quality of the KB tracings produced by the smartwatch, electrophysiologist-interpreted KB recordings were compared to the corresponding 12-lead ECG tracings. A total of 22 recordings were determined to be
TABLE 3 Unclassified KB Readings When Read by Electrophysiologist Compared to Electrophysiologist-Interpreted 12-Lead ECG

Electrophysiologist-          Electrophysiologist-Interpreted 12-Lead ECG
Interpreted KB Reading        AF/Flutter   SR   Noninterpretable   Total
AF/flutter                        14        5          0             19
SR                                 0       20          0             20
Missing/noninterpretable           9        9          0             18
Total                             23       34          0             57

Sensitivity, specificity, and k coefficient are calculated only for the simultaneous transmissions with interpretation (in bold): sensitivity of 100% (14 of 14; 95% confidence interval: 77% to 100%), specificity of 80% (20 of 25; 95% confidence interval: 64% to 96%), and k coefficient of 0.74 (95% confidence interval: 0.54 to 0.95).
Abbreviations as in Table 2.
TABLE 5 KB Automated Reading Compared to Electrophysiologist-Interpreted KB Recordings

                              Electrophysiologist-Interpreted KB Recordings
KB Automatic Reading          AF/Flutter   SR   Missing/Noninterpretable   Total
AF/flutter                        71        1              2                 74
SR                                 5       36              2                 43
Missing/unclassified              20       21             18                 59
Total                             96       58             22                176

Sensitivity, specificity, and k coefficient are calculated only for the simultaneous transmissions with interpretation (in bold): sensitivity of 93% (71 of 76; 95% confidence interval: 88% to 99%), specificity of 97% (36 of 37; 95% confidence interval: 92% to 100%), and k coefficient of 0.88 (95% confidence interval: 0.79 to 0.97).
Abbreviations as in Table 2.
noninterpretable by the reading electrophysiologist, predominately due to baseline artifact. Among the remaining 147 simultaneous recordings in which both the electrophysiologist-interpreted 12-lead ECGs and the electrophysiologist-interpreted KB recordings could be compared, physician interpretation of the KB tracings demonstrated 99% sensitivity, 83% specificity, and a k coefficient of 0.83 (Table 4).
Additionally, to measure the quality of the KB recordings, we compared the KB automated algorithm interpretation to physician interpretation of the same recordings. Of the cases where both methods were interpretable, the KB automated algorithm was 93% sensitive and 97% specific in detecting AF, with a k coefficient of 0.88 (Table 5).
DISCUSSION
Mobile health care technology has proliferated over the past decade. Consumers from the general public now have direct access to devices and applications that offer real-time measurements
TABLE 4 Electrophysiologist-Interpreted KB Reading Compared to Electrophysiologist-Interpreted 12-Lead ECG

Electrophysiologist-          Electrophysiologist-Interpreted 12-Lead ECG
Interpreted KB Reading        AF/Flutter   SR   Noninterpretable   Total
AF/flutter                        80       11          0             91
SR                                 1       55          0             56
Missing/noninterpretable          10       12          0             22
Total                             91       78          0            169

Sensitivity, specificity, and k coefficient are calculated only for the simultaneous transmissions with interpretation (in bold): sensitivity of 99% (80 of 81; 95% confidence interval: 96% to 100%), specificity of 83% (55 of 66; 95% confidence interval: 74% to 92%), and k coefficient of 0.83 (95% confidence interval: 0.74 to 0.92).
Abbreviations as in Table 2.
of cardiovascular physiology, and some technologies extrapolate these data to provide diagnostic information (6). It is estimated that by 2019, annual sales of such devices will reach 50 billion dollars worldwide (7). However, the ability of some devices to accurately measure biometric endpoints has been questioned, and some mobile health technologies are available without verification through rigorous clinical studies (8).
Alongside the growth of mobile health care technology has been the desire of many physicians and patients to accurately monitor disease-related metrics of chronic conditions in the ambulatory setting. AF is a good example of a relapsing condition that requires frequent monitoring of clinical endpoints to assess the efficacy of treatment choices and plan future interventions. The KB is the first smartwatch accessory cleared by the FDA and available to the general public without a prescription that claims to instantaneously detect AF and transmit this information to a patient's treating physician.
In this study, we aimed to assess whether the KB and AF detection algorithm could accurately and reliably differentiate SR from AF in patients with known AF presenting for scheduled electrical CV (Central Illustration). We compared automated KB interpretations to simultaneously recorded ECGs read by blinded electrophysiologists and found very good agreement between them. When able to provide an interpretation, the automated KB readings correctly identified AF with 93% sensitivity and 84% specificity (Figure 2). Of the 169 total KB recordings, 57 (33.7%) were interpreted as unclassified by the automated KB algorithm. Reasons that these recordings were deemed noninterpretable included short recordings <30 s, low-amplitude P waves, and baseline artifact. For those recordings where the automatic KB tracing was noninterpretable, direct
CENTRAL ILLUSTRATION Automated Atrial Fibrillation Detection Algorithm Using Novel Smartwatch Technology
[Graphic: the smartwatch strap contains an electrode sensor that records heart rhythm; the patient places a thumb on the sensor to record a rhythm strip. The app informs the patient if AF is detected, and the results are transmitted to the patient's physician; a recording is labeled unclassified if it does not meet certain criteria. Panels compare methods for interpreting the recording (app algorithm only; physician only; recordings labeled unclassified by the algorithm when reviewed by physician, with 99% sensitivity and 83% specificity for physician review), showing the percentage of patients with interpretable results (66%, 87%, and 100%) and the accuracy of AF diagnosis compared with the 12-lead electrocardiogram.]
Bumgarner, J.M. et al. J Am Coll Cardiol. 2018;71(21):2381–8.
Assessment of the accuracy of the KB (Kardia Band) smartwatch algorithm for AF detection compared with 12-lead ECG (electrocardiogram) in patients
undergoing cardioversion. Automated KB recordings are compared to physician-interpreted 12-lead ECGs and detect AF with 93% sensitivity and 84% specificity.
Physician-interpreted KB recordings are compared to physician-interpreted 12-lead ECGs and detect AF with 99% sensitivity and 83% specificity. Physician-reviewed
unclassified automated KB recordings are compared to physician-interpreted 12-lead ECGs and detect AF with 100% sensitivity and 80% specificity. A total of
22 physician-interpreted KB recordings were noninterpretable.
physician interpretation could be used to correctly identify AF with 100% sensitivity and 80% specificity (Figure 3). In general, the KB recordings, when interpreted by the physician, had excellent agreement with simultaneous 12-lead ECG interpretation, with 99% sensitivity and 83% specificity.
Prior to the development of the KB smartwatch algorithm, several algorithms used by implantable loop recorders (ILRs) were validated for the detection of AF. Currently available ILRs detect AF by sensing R waves and applying a variety of regularity algorithms. The Confirm DM2101 (Abbott, Chicago, Illinois) detects RR interval regularity and measures the suddenness of an irregular rhythm's onset and offset to diagnose AF using 2 probabilistic scoring models. The BioMonitor (Biotronik, Berlin, Germany) also measures R-wave variability and allows the clinician to adjust the number of cycle lengths used and the confirmation time needed to detect AF. The most studied of the ILRs is the Reveal LINQ (Medtronic, Minneapolis, Minnesota) system, whose algorithm for AF detection uses both R-wave irregularity and a programmable P-wave evidence discrimination tool that can be modified based on the individual needs of a given patient (9–11). The Reveal LINQ system was evaluated in XPECT (Reveal XT Performance Trial), in which the sensitivity and specificity for identifying patients with any AF were 96.1% and 85.4%, respectively (12). In our study, the accuracy of the KB algorithm for the detection of AF was comparable to these results.
FIGURE 2 Correct KB Interpretations Compared to Simultaneous ECG
(A) Simultaneous recordings of SR using KB (left) and 12-lead ECG (right). The KB automated algorithm identifies SR for this sample. (B)
Simultaneous recordings of AF using KB (left) and 12-lead ECG (right). The KB automated algorithm identifies AF for this sample. AF = atrial fibrillation; ECG = electrocardiogram; KB = Kardia Band; SR = sinus rhythm.
Wearable devices like the KB require a safe and durable platform upon which recordings can be reviewed and stored. A secure cloud-based platform has been developed to view and download KB recordings. The applicability of this platform to the outpatient management of patients with AF needs to be evaluated in future trials. Our study also demonstrated that a subset of patients (8%) who presented for CV were found to be in SR. For each of these patients, the automated KB algorithm did not erroneously identify AF, and physician interpretation of the KB recording correctly confirmed SR in each case. Although this study was not powered to assess the financial consequences of cancelled CVs, it is reasonable to conclude that a measurable amount of resources was forfeited by both the patient and the health care system in anticipation of a procedure that was ultimately deemed unnecessary once SR was confirmed. As data from the KB can be reviewed remotely, the resources used in preparation for these patients' cancelled CVs could have been saved. The KB system has been previously shown to be cost-effective for AF screening. Our study suggests the potential of such products to provide more effective health care delivery (13).
STUDY LIMITATIONS. This was a single-center study at a tertiary referral center with a small sample size. The population represented in this study had a known history of AF and a sufficient burden of AF to prompt electrical CV. The performance of the KB smartwatch algorithm may be more variable in a population with a lower AF burden. We did not evaluate socioeconomic status in our study, and only 17% of our enrolled patients were female. Additionally, none of the patients who participated in our study had previously used the KB. These facts may limit the generalizability of our findings to the general public, and future studies should consider measuring these variables. Patients with cardiac implantable electronic devices were excluded from
FIGURE 3 Incorrect KB Interpretations Compared to Simultaneous ECG
(A) Heart rhythm recording defined by KB (left) as unclassified with simultaneous 12-lead ECG (right) interpreted as SR. (B) Heart rhythm
recording defined by KB (left) as unclassified with simultaneous 12-lead ECG (right) interpreted as AF. (C) Heart rhythm recording defined by
KB (left) as too short to analyze with simultaneous 12-lead ECG (right) interpreted as AF. Abbreviations as in Figure 2.
this study, and further evaluation of the KB algorithm is needed in this patient population. Participants were instructed on how to use the KB wristband while seated in a hospital bed immediately prior to obtaining each recording. Their ability to record each tracing was directly observed. As a result, the performance of the KB algorithm and the clarity of the recorded tracings may be less accurate in an outpatient or ambulatory setting. For the same reason, some of the unclassified recordings could have been avoided with more patient practice on the proper use of the KB device. Additionally, the KB prototype used in our study did not display a real-time ECG tracing on the watch screen at the time of recording. Since FDA clearance, the KB app is now permitted to display this information. We anticipate that the real-time display of the ECG recording will improve the quality of the recordings obtained by users of the device.

PERSPECTIVES

COMPETENCY IN PATIENT CARE: Among patients with AF undergoing elective cardioversion, an automated smartwatch algorithm with physician oversight accurately differentiates between AF and SR.

TRANSLATIONAL OUTLOOK: As the prevalence of AF rises and access to mobile health care technology expands, randomized trials are needed to validate the sensitivity, specificity, and generalizability and to define the clinical utility of personalized wearable technologies for arrhythmia detection.
CONCLUSIONS
The KB smartwatch automated algorithm for AF detection, supported by physician review of these recordings, can reliably differentiate AF from SR. Avoiding the scheduling of unnecessary electrical CVs is 1 example of a clinical application of the KB system. Many other potential applications warrant further investigation and might transform our longitudinal care of AF patients.
ADDRESS FOR CORRESPONDENCE: Dr. Khaldoun G. Tarakji, Section of Cardiac Pacing and Electrophysiology, Heart and Vascular Institute, Cleveland Clinic, 9500 Euclid Avenue, J2-2, Cleveland, Ohio 44195. E-mail: [email protected].
REFERENCES

1. Coyne KS, Paramore C, Grandy S, et al. Assessing the direct costs of treating nonvalvular atrial fibrillation in the United States. Value Health 2006;9:348–56.
2. January CT, Wann LS, Alpert JS, et al. 2014 AHA/ACC/HRS guideline for the management of patients with atrial fibrillation: a report of the American College of Cardiology/American Heart Association Task Force on Practice Guidelines and the Heart Rhythm Society. J Am Coll Cardiol 2014;64:e1–76.
3. Becker C. Cost-of-illness studies of atrial fibrillation: methodological considerations. Expert Rev Pharmacoecon Outcomes Res 2014;14:661–84.
4. Wodchis WP, Bhatia RS, Leblanc K, et al. A review of the cost of atrial fibrillation. Value Health 2012;15:240–8.
5. Tarakji KG, Wazni OM, Callahan T, et al. Using a novel wireless system for monitoring patients after the atrial fibrillation ablation procedure: the iTransmit study. Heart Rhythm 2015;12:554–9.
6. Freedman B. Screening for atrial fibrillation using a smartphone: is there an app for that? J Am Heart Assoc 2016;5:e004000.
7. Piwek L, Ellis DA, Andrews S, et al. The rise of consumer health wearables: promises and barriers. PLoS Med 2016;13:e1001953.
8. Gillinov S, Etiwy M, Wang R, et al. Variable accuracy of wearable heart rate monitors during aerobic exercise. Med Sci Sports Exerc 2017;49:1697–703.
9. Lee R, Mittal S. Utility and limitations of long-term monitoring of atrial fibrillation using an implantable loop recorder. Heart Rhythm 2018;15:287–95.
10. Passman RS, Rogers JD, Sarkar S, et al. Development and validation of a dual sensing scheme to improve accuracy of bradycardia and pause detection in an insertable cardiac monitor. Heart Rhythm 2017;14:1016–23.
11. Mittal S, Rogers J, Sarkar S, et al. Real-world performance of an enhanced atrial fibrillation detection algorithm in an insertable cardiac monitor. Heart Rhythm 2016;13:1624–30.
12. Hindricks G, Pokushalov E, Urban L, et al. Performance of a new leadless implantable cardiac monitor in detecting and quantifying atrial fibrillation: results of the XPECT trial. Circ Arrhythm Electrophysiol 2010;3:141–7.
13. Lowres N, Neubeck L, Salkeld G, et al. Feasibility and cost-effectiveness of stroke prevention through community screening for atrial fibrillation using iPhone ECG in pharmacies: the SEARCH-AF study. Thromb Haemost 2014;111:1167–76.
KEY WORDS atrial fibrillation, cardioversion, digital health, ECG monitoring, smartwatch

APPENDIX For a supplemental table, please see the online version of this paper.